Chris Manning works on systems and formalisms that can intelligently process and produce human languages. His research concentrates on probabilistic models of language and statistical natural language processing, including text understanding, text mining, machine translation, information extraction, named entity recognition, part-of-speech tagging, probabilistic parsing and semantic role labeling, syntactic typology, computational lexicography, and other topics in computational linguistics and machine learning.
Mail: Dept of Computer Science, Gates Building 1A, 353 Serra Mall, Stanford CA 94305-9010, USA
Work: +1 (650) 723-7683
Fax: +1 (650) 725-1449
Assistant: Prachi Balaji, Gates 150, +1 (650) 725-3358, firstname.lastname@example.org
Most of my papers are available online in my publication list; listings may be more up to date at Google Scholar or Microsoft Academic Search.
My new book, Introduction to Information Retrieval, with Hinrich Schütze and Prabhakar Raghavan, is now available in print. My "bestseller" is Manning and Schütze, Foundations of Statistical Natural Language Processing (MIT Press, 1999). My two monographs are Ergativity: Argument Structure and Grammatical Relations and Complex Predicates and Information Spreading in LFG.
I'm one of the program co-chairs for the new International Conference on Learning Representations. Most exciting! Get involved.
A few talks are available online.
My current research focuses on robust but linguistically sophisticated probabilistic natural language processing, and opportunities to use it in real-world domains. Particular topics include richer models for probabilistic parsing, computational semantics, machine translation, grammar induction, text categorization and clustering, incorporating probabilistic models into syntactic theories, electronic dictionaries and their usability (particularly for indigenous languages), information extraction and presentation, and linguistic typology.
My research at Stanford is currently supported by an IBM Faculty Partnership Award, IARPA, DARPA, and the NSF.
I am interested in new students wanting to work in the area of Natural Language Processing. To find out more about what I do, it's best to look at my papers, or my group research page. However, to cut down on my email load, I ask that you read some more information first.
Starting January 2012, I'll be teaching a free online course on Natural Language Processing with Dan Jurafsky.
In June 2011, I taught a tutorial Natural Language Processing Tools for the Digital Humanities at Digital Humanities 2011 at Stanford.
In fall 2008, I will again teach CS 276: Information Retrieval and Web Search, with Prabhakar Raghavan. Earlier versions of this course include two years of two-quarter sequences CS276A/B on information retrieval and text information classification and extraction, broadly construed ("IR++"): Fall quarter course website. Winter quarter course website. This course started in 2001. Early versions were also co-taught by Hinrich Schütze.
In spring 2009, I will again teach CS 224N / Ling 237. Natural Language Processing -- Develops an in-depth understanding of both the algorithms available for the processing of linguistic information and the underlying computational properties of natural languages. Morphological, syntactic, and semantic processing from both a linguistic and an algorithmic perspective. Focus on modern quantitative techniques in NLP: using large corpora, statistical models for acquisition, disambiguation, and parsing. Examination and construction of representative systems. Prerequisites: CS 121/221 or Ling 138/238, and programming experience. Recommended: basic familiarity with logic and probability. 3 units. I've taught this course yearly since spring 2000. Many previous student projects are available online.
In fall 2007 I taught Ling 289: Quantitative and Probabilistic Explanation in Linguistics MW 2:15-3:45 in 160-318. I previously taught it in winter 2002 (née Ling 236) and Winter 2005 (as Ling 235).
In the summer of 2007, I taught at the LSA Linguistic Institute: Statistical Parsing and Computational Linguistics in Industry.
In fall 1999 and winter 2001, I taught CS 121 Artificial Intelligence. The textbook was S. Russell and P. Norvig, Artificial Intelligence: A Modern Approach.
The NLP Reading Group (which I ran 1999-2002) has now been transformed into the Natural Language and Speech Processing Colloquium.
LaTeX: When I used to have more time (i.e., when I was a grad student), I used to spend some of it writing (La)TeX macros. [Actually, that's a lie; I still spend some time doing it....]
We've now got two sons: Joel and Casey. Here are my opinions on books for the very young.