(under construction … see also Controlled Natural Language, Attempto, Semantic Parsing, Transformers)
Lectures and Courses on NLP
Books on NLP
Deep Learning
Neural Machine Translation: A Review and Survey, 2020.
Universal Dependencies: A cross-linguistic typology
Globally Normalized Transition-Based Neural Networks
Anaphora and Coreference Resolution: A Review
Stanford Log-linear Part-Of-Speech Tagger
Watson and Question Answering … DeepQA …
Learning Context-Free Grammars: Wikipedia
Yes you should understand backprop
What is a language? There is language in the real world. There is language as a probability distribution. And there is language as given by data. So we need to think about at least three different notions. [1]
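To make the second and third notions concrete, here is a minimal sketch in Python (the toy corpus and the unigram estimator are illustrative assumptions, not taken from any of the resources above):

```python
from collections import Counter

# "Language as data": a finite sample of observed sentences.
# "Language as a probability distribution": a model estimated from that sample,
# assigning a probability to every word sequence.
# The corpus and the unigram model here are purely illustrative.

corpus = [
    "i translated this text from english to german",
    "this text is in english",
    "english is a language",
]

# Language as data: the bag of observed tokens.
tokens = [w for sentence in corpus for w in sentence.split()]
counts = Counter(tokens)
total = sum(counts.values())

# Language as a probability distribution: a unigram model over words,
# extended multiplicatively to word sequences.
def unigram_probability(word: str) -> float:
    return counts[word] / total

def sentence_probability(sentence: str) -> float:
    p = 1.0
    for w in sentence.split():
        p *= unigram_probability(w)
    return p

print(sentence_probability("english is a language"))
```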
There is a metaphysical problem here. Are the languages in the real world real? Does English exist? Or is "English" just a name without a precise meaning that we use to communicate a variety of vague phenomena? Certainly we are confident that we understand sentences such as "I translated this text from English to German." But that does not mean that I can give a definition of "English", or that I can answer the question: What is English? ↩︎