Deep Learning in Natural Language Processing
Overview
Deep learning has recently shown much promise for NLP applications.
Traditionally, most NLP approaches represent documents or sentences
as a sparse bag of words. Much recent work, including work at
Stanford, goes beyond this by adopting a distributed representation
of words, constructing a so-called "neural embedding" or vector-space
representation of each word or document.
Beyond word-level embeddings, Stanford work at the intersection of
deep learning and natural language processing has aimed in particular
at handling variable-sized sentences in a natural way, by capturing
the recursive structure of natural language. We explore recursive
neural networks for parsing, paraphrase detection of short phrases
and longer sentences, sentiment analysis, machine translation, and
natural language inference; the core composition step is sketched
below. Our approaches go beyond learning word vectors: they also
learn vector representations for multi-word phrases, grammatical
relations, and bilingual phrase pairs, all of which are useful for a
variety of NLP applications.
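At the heart of these recursive models is a composition function that merges two child vectors into a single parent vector of the same dimensionality, applied bottom-up along a parse tree. The following is a minimal numpy sketch of the basic form p = tanh(W[c1; c2] + b); the parameters here are random stand-ins for learned ones, and the richer variants in the papers below (matrix-vector and tensor compositions) extend this same recipe:

```python
import numpy as np

d = 5                                      # words and phrases share one vector space
rng = np.random.default_rng(0)
W = 0.1 * rng.standard_normal((d, 2 * d))  # composition matrix (learned in practice)
b = np.zeros(d)

def compose(left, right):
    """Merge two child vectors into one parent vector of the same size."""
    return np.tanh(W @ np.concatenate([left, right]) + b)

# Follow a given binary parse: ((very good) movie)
very, good, movie = (0.1 * rng.standard_normal(d) for _ in range(3))
very_good = compose(very, good)           # phrase vector, same dimensionality
sentence_vec = compose(very_good, movie)  # fixed-size vector for the whole span
```

Because the parent has the same dimensionality as its children, the same function can be reapplied at every node, which is what lets a single model handle constituents, and hence sentences, of any length.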
Papers
-
Samuel R. Bowman, Gabor Angeli, Christopher Potts, and Christopher D. Manning. 2015. A large annotated corpus for learning natural language inference. In Conference on Empirical Methods in Natural Language Processing (EMNLP 2015). [pdf] [corpus page]
-
Samuel R. Bowman, Christopher D. Manning, and Christopher Potts. 2015. Tree-structured composition in neural networks without tree-structured architectures. arXiv manuscript 1506.04834. [pdf]
-
Samuel R. Bowman, Christopher Potts, and Christopher D. Manning. 2015. Recursive Neural Networks Can Learn Logical Semantics. In Proceedings of the 3rd Workshop on Continuous Vector Space Models and their Compositionality. [pdf] [code and data]
-
Samuel R. Bowman, Christopher Potts, and Christopher D. Manning. 2015. Learning Distributed Word Representations for Natural Logic Reasoning. In Proceedings of the AAAI Spring Symposium on Knowledge Representation and Reasoning. [pdf]
-
Danqi Chen and Christopher D. Manning. 2014. A Fast and Accurate Dependency Parser using Neural Networks. In EMNLP 2014. [pdf]
-
Jeffrey Pennington, Richard Socher, and Christopher D. Manning. 2014. GloVe: Global Vectors for Word Representation. In Conference on Empirical Methods in Natural Language Processing (EMNLP 2014). [pdf] [website with word vectors]
-
Samuel R. Bowman. 2013. Can recursive neural tensor networks learn logical reasoning?
arXiv:1312.6192. [pdf]
-
Richard Socher,
Danqi Chen,
Christopher D. Manning,
and Andrew Y. Ng.
2013.
Reasoning With Neural Tensor Networks For Knowledge Base Completion.
In Advances in Neural
Information Processing Systems 26.
[pdf]
-
Richard Socher, Alex Perelygin, Jean Wu, Jason Chuang, Christopher D. Manning, Andrew Y. Ng, and Christopher Potts. 2013. Recursive Deep Models for Semantic Compositionality Over a Sentiment Treebank. In Conference on Empirical Methods in Natural Language Processing (EMNLP 2013). [pdf] [website] [demo]
-
Will Zou, Richard Socher, Daniel Cer, and Christopher D. Manning. 2013. Bilingual Word Embeddings for Phrase-Based Machine Translation. In EMNLP 2013. [pdf]
-
Richard Socher, John Bauer, Christopher D. Manning, and Andrew Y. Ng. 2013. Parsing with Compositional Vector Grammars. In ACL 2013. [pdf]
-
Thang Luong, Richard Socher, Christopher D. Manning. 2013. Better
Word Representations with Recursive Neural Networks for Morphology.
Conference on Computational Natural Language Learning (CoNLL 2013).
[pdf]
-
Danqi Chen, Richard Socher, Christopher D. Manning, and Andrew Y. Ng. 2013. Learning New Facts From Knowledge Bases With Neural Tensor Networks and Semantic Word Vectors. [pdf]
-
Mengqiu Wang and Christopher D. Manning. 2013. Effect of Non-linear Deep Architecture in Sequence Labeling. In ICML 2013 Workshop on Deep Learning for Audio, Speech and Language Processing. [pdf]
-
Richard Socher, Milind Ganjoo, Hamsa Sridhar, Osbert Bastani, Christopher D. Manning, and Andrew Y. Ng. 2013. Zero-Shot Learning Through Cross-Modal Transfer. In Advances in Neural Information Processing Systems 26. [pdf]
-
Richard Socher, Brody Huval, Christopher D. Manning, and Andrew Y. Ng. 2012. Semantic Compositionality through Recursive Matrix-Vector Spaces. In EMNLP 2012. [pdf] [website]
-
Eric H. Huang, Richard Socher, Christopher D. Manning, and Andrew Y. Ng. 2012. Improving Word Representations via Global Context and Multiple Word Prototypes. In ACL 2012. [pdf] [website]
-
Richard Socher, Eric Huang, Jeffrey Pennington, Andrew Y. Ng, and Christopher D. Manning. 2011. Dynamic Pooling and Unfolding Recursive Autoencoders for Paraphrase Detection. In Advances in Neural Information Processing Systems (NIPS 2011). [pdf] [website]
-
Richard Socher, Jeffrey Pennington, Eric Huang, Andrew Y. Ng, and Christopher D. Manning. 2011. Semi-Supervised Recursive Autoencoders for Predicting Sentiment Distributions. In EMNLP 2011. [pdf] [website]
-
Richard Socher, Cliff Lin, Andrew Y. Ng, and Christopher D. Manning. 2011. Parsing Natural Scenes and Natural Language with Recursive Neural Networks. In ICML 2011. (Distinguished Application Paper Award) [pdf] [website]
-
Richard Socher, Christopher D. Manning, and Andrew Y. Ng. 2010. Learning Continuous Phrase Representations and Syntactic Parsing with Recursive Neural Networks. In NIPS Deep Learning and Unsupervised Feature Learning Workshop 2010. [pdf]
Tutorial
There's a separate page for our tutorial on Deep Learning for NLP.
Contact Information
For any comments or questions, please feel free to email danqi at cs
dot stanford dot edu.