---
tags: NLP, 298, transformer
---

# Introduction to [Transformers](https://www.google.com/search?q=transformers&tbm=isch)

We spent the main part of the lecture on the demos, to give you a sense of how far the state of the art in AI has advanced in the last few years. But we start with a short overview of some technical background.

## Technical Background

Transformers are a simplification ("attention is all you need") of previous deep neural networks (DNNs) for natural language processing (and other tasks, including image processing). In class we gave a quick introduction to DNNs and backpropagation and hinted at the differences between Transformers and the earlier architectures known as recurrent neural networks (RNNs).

## Demos

- [CoreNLP](https://stanfordnlp.github.io/CoreNLP/index.html) ([Demo](https://corenlp.run/))
- [Google Translate](https://translate.google.com/)
- [OpenAI](https://beta.openai.com/) ([Examples](https://beta.openai.com/examples), [Playground](https://beta.openai.com/playground))
- [GitHub Copilot](https://github.com/github/copilot-docs/blob/main/docs/visualstudiocode/gettingstarted.md#installing)

## Thinking About the Future of AI

Given the amazing progress we have seen in the demos, what is the future of AI (and humanity)? Here are some topics to think about. Add your own.

- How will AI influence the job market?
- Artificial General Intelligence (AGI): Will software become more intelligent than humans? If yes, when will this happen? What will be the consequences for us humans? What is the AI singularity?
- Where will the AI arms race that has started between countries like the US and China take us?
- What is the environmental impact of AI? [^environmental]

[^environmental]: Much of the progress comes at an exponentially increasing cost (energy consumption, carbon footprint, etc.). So while AI has been quickly catching up with the human brain in performance, the gap in resource consumption has been widening enormously.
    - [Training a single AI model can emit as much carbon as five cars in their lifetimes](https://www.technologyreview.com/2019/06/06/239031/training-a-single-ai-model-can-emit-as-much-carbon-as-five-cars-in-their-lifetimes)

## Homework

<font color=red>Choose one of the prompts below and post on the Slack channel before the next lecture (3/30). Also choose one post by another student and reply with an opinion of your own by the end of the following Friday (4/1).

- A link to an article about a recent application of AI that you find interesting, with a short opinion of your own.
- A link to an article about the future of AI, with a short opinion of your own.
</font>

Think about possible project topics. Here are some ideas. (We will talk about this in more detail after the spring break.)

- Use Grammatical Framework to build a recipe translator.
- Use a language model for one of various NLP tasks such as sentiment analysis or topic detection.
- Can we measure the distance between languages by using Google Translate (or OpenAI) to translate back and forth between two languages and then measure the similarity? (See the sketch after this list.) [^fixedpoint] [^measure]
- ... (let us know about your ideas) ...

[^fixedpoint]: It is also interesting to measure how long it takes to reach a fixed point.
[^measure]: Remember that measuring similarity can be done with language models.
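To make the back-and-forth translation idea concrete, here is a minimal sketch (Python, standard library only). The `translate` function is a hypothetical stand-in for whatever translation service you end up using (Google Translate API, OpenAI, ...), and the string-overlap similarity is a crude placeholder for the language-model-based measure mentioned in the footnote.

```python
from difflib import SequenceMatcher

def similarity(a: str, b: str) -> float:
    """Crude surface similarity in [0, 1]; a language model would
    give a better semantic measure (see footnote above)."""
    return SequenceMatcher(None, a, b).ratio()

def round_trip_distance(text, translate, src="en", tgt="fr", max_rounds=10):
    """Translate src -> tgt -> src repeatedly until the text stops
    changing (a fixed point) or max_rounds is reached.

    `translate(text, source, target)` is a hypothetical stand-in for
    a real translation call. Returns (distance, round trips taken).
    """
    current = text
    rounds = 0
    for rounds in range(1, max_rounds + 1):
        previous = current
        current = translate(translate(current, src, tgt), tgt, src)
        if current == previous:  # fixed point reached
            break
    return 1.0 - similarity(text, current), rounds

# Toy "translator" so the sketch runs as-is; swap in a real API call.
toy_translate = lambda text, source, target: text.lower()
print(round_trip_distance("Attention Is All You Need", toy_translate))
```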
## Further Sources

### Introductions and Examples

- [6 min video](https://jalammar.github.io/illustrated-bert/) overview of some of the applications of transformers, as well as an overview of some of the popular architectures.
- Quanta Magazine: [Will Transformers Take Over Artificial Intelligence?](https://www.quantamagazine.org/will-transformers-take-over-artificial-intelligence-20220310)
- [AlphaFold, GPT-3 and How to Augment Intelligence with AI](https://future.a16z.com/alphafold-gpt-3-and-how-to-augment-intelligence-with-ai/) and [Pt. 2](https://future.a16z.com/alphafold-gpt-3-and-how-to-augment-intelligence-with-ai-pt-2/).

### More Examples

- A [Guardian article](https://www.theguardian.com/commentisfree/2020/sep/08/robot-wrote-this-article-gpt-3) written by GPT-3.
- A [video](https://www.youtube.com/watch?v=TfVYxnhuEdU) by Tom Scott about GPT.

### Introductory Technical Background

- [Dive into Deep Learning, chapter on BERT](https://d2l.ai/chapter_natural-language-processing-pretraining/bert.html)

### Research

In this short course, we don't have the time to introduce the mathematics needed to understand the original research. But if you want to dive deeper in the future, it can't hurt to read the introductions and conclusions of the articles and build a mental landscape which you can fill in later with the mathematical details. [^experiments] [^hardware]

[^experiments]: Many of the articles also contain links to git repositories. A good hands-on way to learn more is to see whether you can recreate (variations of) the experiments and results reported in the papers. This can take a lot of work, but it is a great way to learn.
[^hardware]: If you read some of the articles, try to get a sense of how much of the progress is driven by improving hardware.

#### Theory of Transformers

These two papers introduced "attention" to NLP:

- Bahdanau et al., [Neural Machine Translation by Jointly Learning to Align and Translate](https://arxiv.org/pdf/1409.0473.pdf), 2014. "Neural machine translation is a newly emerging approach to machine translation. Unlike the traditional phrase-based translation system which consists of many small sub-components that are tuned separately, neural machine translation attempts to build and train a single, large neural network that reads a sentence and outputs a correct translation."
- Luong et al., [Effective Approaches to Attention-based Neural Machine Translation](https://arxiv.org/pdf/1508.04025.pdf), 2015. "In this work, we design, with simplicity and effectiveness in mind, two novel types of attention-based models."

This paper, building on the previous two, is credited with introducing the "transformer":

- Vaswani et al., [Attention is all you need](https://papers.nips.cc/paper/2017/file/3f5ee243547dee91fbd053c1c4a845aa-Paper.pdf), 2017. See also the [annotated transformer](https://github.com/harvardnlp/annotated-transformer) on GitHub. "In this work we propose the Transformer, a model architecture eschewing recurrence and instead relying entirely on an attention mechanism to draw global dependencies between input and output. The Transformer allows for significantly more parallelization and can reach a new state of the art in translation quality after being trained for as little as twelve hours on eight P100 GPUs."

BERT was the breakthrough transformer, setting new standards on a range of NLP benchmarks. [Github](https://github.com/google-research/bert).
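To make "attention" slightly less mysterious, here is a minimal numpy sketch of the scaled dot-product attention at the heart of the Vaswani et al. paper. Everything is stripped down for readability: a single head, no masking, and no learned projections.

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """Attention(Q, K, V) = softmax(Q K^T / sqrt(d_k)) V
    (Vaswani et al., 2017) -- one head, no masking."""
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)                 # how much each query "attends" to each key
    scores -= scores.max(axis=-1, keepdims=True)    # shift for numerical stability
    weights = np.exp(scores)
    weights /= weights.sum(axis=-1, keepdims=True)  # row-wise softmax
    return weights @ V                              # weighted average of the values

# Tiny example: a "sentence" of 3 tokens with 4-dimensional embeddings.
rng = np.random.default_rng(42)
X = rng.normal(size=(3, 4))
# In a real transformer, Q, K, and V are learned linear projections of X;
# we reuse X directly to keep the sketch minimal.
print(scaled_dot_product_attention(X, X, X))
```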
GPT uses transformers to learn a language model: Radford et al., [Language Models are Unsupervised Multitask Learners](https://cdn.openai.com/better-language-models/language_models_are_unsupervised_multitask_learners.pdf), 2019.

T5 (Text-To-Text Transfer Transformer) casts every NLP task as text-to-text. [Video](https://www.youtube.com/watch?v=eKqWC577WlI) by Colin Raffel. [Google AI Blog](https://ai.googleblog.com/2020/02/exploring-transfer-learning-with-t5.html).

Facebook's [XLM-R](https://github.com/pytorch/fairseq/blob/main/examples/xlmr/README.md).

#### Applications of Transformers

- More applications of transformers to NLP:
    - [Language Models are Few-Shot Learners](https://arxiv.org/pdf/2005.14165.pdf), 2020.
    - [Neural Databases](https://arxiv.org/pdf/2010.06973.pdf), 2020.
    - [Introducing FLAN: More generalizable Language Models with Instruction Fine-Tuning](https://ai.googleblog.com/2021/10/introducing-flan-more-generalizable.html), 2021.
- Transformers beat CNNs at image recognition: Dosovitskiy et al., [An Image is Worth 16x16 Words: Transformers for Image Recognition at Scale](https://arxiv.org/pdf/2010.11929.pdf), 2021.
- Transformers for composing and performing music: [Music Transformer](https://arxiv.org/pdf/1809.04281.pdf), 2018. [Magenta blog](https://magenta.tensorflow.org/music-transformer) ... [demos](https://magenta.tensorflow.org/demos/) ... [github](https://github.com/magenta/magenta).
- Protein folding: [AlphaFold](https://en.wikipedia.org/wiki/AlphaFold)
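The papers above are about building such models. If you just want to play with pretrained transformers yourself, the Hugging Face `transformers` library (not covered in the lecture) wraps many of them, including BERT- and T5-style models, behind a one-line API. A minimal sketch, assuming the library is installed (`pip install transformers`); the printed outputs are illustrative, since the default models may change between library versions:

```python
from transformers import pipeline

# Downloads a default pretrained model the first time a pipeline is built.
classifier = pipeline("sentiment-analysis")
print(classifier("The transformer demos in this lecture were amazing."))
# e.g. [{'label': 'POSITIVE', 'score': 0.99...}]

translator = pipeline("translation_en_to_de")  # a T5 model by default
print(translator("Attention is all you need."))
```

This is also a low-effort way to get started on the sentiment analysis or topic detection project ideas above.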