---
tags: nlp 298, word embeddings, word2vec, haub and kurz
---
# NLP 298 Word Embeddings
An introduction to NLP with word embeddings, following Google's Machine Learning Foundations series (with Python notebooks); a code sketch of the tokenization and sequencing APIs follows the episode list.
- [Ep #8 - Tokenization for Natural Language Processing](https://www.youtube.com/watch?v=f5YJA5mQD5c)
- [Ep #9 - Using the Sequencing APIs](https://www.youtube.com/watch?v=L3suP4g8p7U)
- [Ep #10 - Using NLP to build a sarcasm classifier](https://www.youtube.com/watch?v=-8XmD2zsFBI)
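Episodes 8 and 9 revolve around the Keras `Tokenizer` and sequencing APIs. Below is a minimal sketch of that workflow, assuming TensorFlow is installed; the toy sentences are placeholders, not the notebooks' exact data:

```python
from tensorflow.keras.preprocessing.text import Tokenizer
from tensorflow.keras.preprocessing.sequence import pad_sequences

sentences = [
    "I love my dog",
    "I love my cat",
    "Do you think my dog is amazing?",
]

# Fit the tokenizer; oov_token stands in for words unseen during fitting.
tokenizer = Tokenizer(num_words=100, oov_token="<OOV>")
tokenizer.fit_on_texts(sentences)

# Turn sentences into integer sequences and pad them to equal length.
sequences = tokenizer.texts_to_sequences(sentences)
padded = pad_sequences(sequences, padding="post")

print(tokenizer.word_index)
print(padded)
```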
A deeper look at word embeddings, based on [CS224N Assignment 1: Exploring Word Vectors](https://web.stanford.edu/class/archive/cs/cs224n/cs224n.1194/assignments/a1_preview/exploring_word_vectors.html); both approaches are sketched in code after the list.
- [Count-based word vectors](https://hackmd.io/@alexhkurz/S1Jldeyeq)
- [Word2Vec](https://hackmd.io/@alexhkurz/H1IJ4qxlq)
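A minimal count-based sketch in the spirit of the assignment: build a co-occurrence matrix over a toy corpus and reduce it with SVD. The assignment itself uses the NLTK Reuters corpus and scikit-learn's `TruncatedSVD`; plain NumPy is used here to keep the example self-contained:

```python
import numpy as np

# Toy corpus; the assignment uses the NLTK Reuters corpus instead.
corpus = [
    ["all", "that", "glitters", "is", "not", "gold"],
    ["all", "is", "well", "that", "ends", "well"],
]

# Build the vocabulary.
words = sorted({w for sent in corpus for w in sent})
word2ind = {w: i for i, w in enumerate(words)}
V = len(words)

# Co-occurrence matrix with a symmetric context window.
window = 2
M = np.zeros((V, V))
for sent in corpus:
    for i, w in enumerate(sent):
        for j in range(max(0, i - window), min(len(sent), i + window + 1)):
            if i != j:
                M[word2ind[w], word2ind[sent[j]]] += 1

# Reduce to k dimensions with SVD: M = U S V^T, keep the top-k components.
k = 2
U, S, Vt = np.linalg.svd(M)
embeddings = U[:, :k] * S[:k]   # each row is a k-dimensional word vector
print(embeddings[word2ind["gold"]])
```

For the prediction-based side, a usage sketch of skip-gram Word2Vec via gensim (assuming gensim 4.x; `sg=1` selects the skip-gram architecture discussed in the McCormick tutorial below):

```python
from gensim.models import Word2Vec

sentences = [
    ["all", "that", "glitters", "is", "not", "gold"],
    ["all", "is", "well", "that", "ends", "well"],
]

# min_count=1 keeps every word of the toy corpus in the vocabulary.
model = Word2Vec(sentences, vector_size=50, window=2, min_count=1, sg=1)
print(model.wv.most_similar("gold", topn=3))
```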
## References
Kirk Baker: [Singular Value Decomposition Tutorial](https://davetang.org/file/Singular_Value_Decomposition_Tutorial.pdf), 2005.
Steven Bird, Ewan Klein, and Edward Loper: [Natural Language Processing with Python](https://www.nltk.org/book/ch02.html), Chapter 2, 2019.
[Python Numpy Tutorial (with Jupyter and Colab)](https://cs231n.github.io/python-numpy-tutorial/)
Jake VanderPlas: [The Basics of NumPy Arrays](https://jakevdp.github.io/PythonDataScienceHandbook/02.02-the-basics-of-numpy-arrays.html), Python Data Science Handbook. All elements of an array must have the same type; slices are views, not copies (demonstrated in the sketch below).
Chris McCormick: [Word2Vec Tutorial - The Skip-Gram Model](http://mccormickml.com/2016/04/19/word2vec-tutorial-the-skip-gram-model/), 2016.
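A quick demonstration of the "slices are views" point from the NumPy arrays reference above:

```python
import numpy as np

a = np.arange(10)        # array([0, 1, ..., 9])
s = a[2:5]               # a slice is a *view* onto a's memory, not a copy
s[0] = 99                # modifying the view...
print(a[2])              # ...changes the original array: prints 99

b = a[2:5].copy()        # an explicit copy is independent
b[0] = -1
print(a[2])              # still 99: the original is untouched
```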