Deep learning has recently shown much promise for NLP applications.
Traditionally, most NLP approaches have represented documents or sentences
as sparse bag-of-words vectors. A growing body of work, including work at
Stanford, goes beyond this by adopting a distributed representation of
words: each word or document is mapped to a dense vector, a so-called
"neural embedding", in a continuous vector space.
Beyond this, Stanford work at the intersection of deep learning and
natural language processing has aimed in particular at handling
variable-sized sentences in a natural way, by capturing the recursive
structure of natural language. We explore recursive neural networks for
parsing, paraphrase detection of short phrases and longer texts,
sentiment analysis, machine translation, and natural language inference.
Our approaches go beyond learning word vectors: they also learn vector
representations for multi-word phrases, grammatical relations, and
bilingual phrase pairs, all of which are useful for a range of NLP
applications.
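As a rough illustration of the recursive idea, the sketch below composes word vectors bottom-up along a binarized parse tree using the standard composition p = tanh(W[c1; c2] + b). The tree, the dimensionality, and the random untrained parameters are illustrative assumptions, not the trained models from the papers listed below.

    import numpy as np

    dim = 4
    rng = np.random.default_rng(0)
    W = rng.normal(scale=0.1, size=(dim, 2 * dim))  # composition matrix, learned in practice
    b = np.zeros(dim)

    def compose(left, right):
        # Core recursive-net step: p = tanh(W [left; right] + b).
        return np.tanh(W @ np.concatenate([left, right]) + b)

    def encode(tree, emb):
        # Leaves are word strings; internal nodes are (left, right) pairs.
        if isinstance(tree, str):
            return emb[tree]
        left, right = tree
        return compose(encode(left, emb), encode(right, emb))

    # "((the movie) (was great))" as a toy binarized parse tree.
    emb = {w: rng.normal(scale=0.1, size=dim) for w in ["the", "movie", "was", "great"]}
    sentence_vec = encode((("the", "movie"), ("was", "great")), emb)
    print(sentence_vec)  # a single fixed-size vector for a variable-sized sentence

Because the same composition function is applied at every node, sentences of any length and shape map to vectors of the same dimensionality, which is what lets a single classifier operate over phrases of different sizes.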
Samuel R. Bowman, Gabor Angeli, Christopher Potts, and Christopher D. Manning. 2015. A large annotated corpus for learning natural language inference. In Conference on Empirical Methods in Natural Language Processing (EMNLP 2015). [pdf] [corpus page]
Samuel R. Bowman, Christopher D. Manning, and Christopher Potts. 2015. Tree-structured composition in neural networks without tree-structured architectures. arXiv manuscript 1506.04834. [pdf]
Richard Socher, Alex Perelygin, Jean Wu, Jason Chuang, Christopher D. Manning, Andrew Y. Ng, and Christopher Potts. 2013. Recursive Deep Models for Semantic Compositionality Over a Sentiment Treebank. Conference on Empirical Methods in Natural Language Processing (EMNLP 2013).
Will Zou, Richard Socher, Daniel Cer, and Christopher D. Manning. 2013. Bilingual Word Embeddings for Phrase-Based Machine Translation. Conference on Empirical Methods in Natural Language Processing (EMNLP 2013).
Richard Socher, John Bauer, Christopher D. Manning, and Andrew Y. Ng. 2013. Parsing with Compositional Vector Grammars. Annual Meeting of the Association for Computational Linguistics (ACL 2013).
Thang Luong, Richard Socher, and Christopher D. Manning. 2013. Better Word Representations with Recursive Neural Networks for Morphology. Conference on Computational Natural Language Learning (CoNLL 2013).
Danqi Chen, Richard Socher, Christopher D. Manning, and Andrew Y. Ng. 2013. Learning New Facts From Knowledge Bases With Neural Tensor Networks and Semantic Word Vectors. International Conference on Learning Representations (ICLR 2013), workshop track.
Mengqiu Wang and Christopher D. Manning. 2013. Effect of Non-linear Deep Architecture in Sequence Labeling. ICML 2013 Workshop on Deep Learning for Audio, Speech and Language Processing.
Richard Socher, Milind Ganjoo, Hamsa Sridhar, Osbert Bastani, Christopher D. Manning, and Andrew Y. Ng. 2013. Zero-Shot Learning Through Cross-Modal Transfer.
Richard Socher, Brody Huval, Christopher D. Manning, and Andrew Y. Ng. 2012. Semantic Compositionality through Recursive Matrix-Vector Spaces. Conference on Empirical Methods in Natural Language Processing (EMNLP 2012).
Eric H. Huang, Richard Socher, Christopher D. Manning, and Andrew Y. Ng. 2012. Improving Word Representations via Global Context and Multiple Word Prototypes. Annual Meeting of the Association for Computational Linguistics (ACL 2012).
Richard Socher, Eric H. Huang, Jeffrey Pennington, Andrew Y. Ng, and Christopher D. Manning. 2011. Dynamic Pooling and Unfolding Recursive Autoencoders for Paraphrase Detection. Advances in Neural Information Processing Systems (NIPS 2011).
Richard Socher, Jeffrey Pennington, Eric H. Huang, Andrew Y. Ng, and Christopher D. Manning. 2011. Semi-Supervised Recursive Autoencoders for Predicting Sentiment Distributions. Conference on Empirical Methods in Natural Language Processing (EMNLP 2011).
Richard Socher, Cliff Lin, Andrew Y. Ng, and Christopher D. Manning. 2011. Parsing Natural Scenes and Natural Language with Recursive Neural Networks. International Conference on Machine Learning (ICML 2011). Distinguished Application Paper Award.
Richard Socher, Christopher D. Manning, and Andrew Y. Ng. 2010. Learning Continuous Phrase Representations and Syntactic Parsing with Recursive Neural Networks. NIPS 2010 Deep Learning and Unsupervised Feature Learning Workshop.