
(under construction)

Introductory Materials

Articles

Original Articles

These two papers introduce "attention" to NLP:

This paper is credited with introducing the "transformer":
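As a quick reference for that paper's core operation, here is a minimal NumPy sketch of scaled dot-product attention, softmax(QK^T / sqrt(d_k))V, as defined in "Attention Is All You Need". The shapes and random inputs are just for illustration:

```python
import numpy as np

def softmax(z, axis=-1):
    z = z - z.max(axis=axis, keepdims=True)   # subtract max for numerical stability
    e = np.exp(z)
    return e / e.sum(axis=axis, keepdims=True)

def attention(Q, K, V):
    """Scaled dot-product attention: softmax(Q K^T / sqrt(d_k)) V."""
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)           # (n_q, n_k) similarity scores
    return softmax(scores, axis=-1) @ V       # weighted average of the values

# Toy example: 4 query tokens attending over 6 key/value tokens, dimension 8.
rng = np.random.default_rng(0)
Q, K, V = rng.normal(size=(4, 8)), rng.normal(size=(6, 8)), rng.normal(size=(6, 8))
print(attention(Q, K, V).shape)               # -> (4, 8)
```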

GPT uses transformers to learn a language model:

BERT is another transformer based language model:

More Applications of Transformers to NLP:

Transformers beat CNNs for image recognition:

Transformers for composing and performing music:

Protein folding:

Ethics:

Surveys

Criticism

Symbolic regression with genetic algorithms (interpretable, but weak on high-dimensional problems) combined with deep learning (strong on high-dimensional problems): fit a neural network to the data, then run symbolic regression on the network (Data -> NN -> SR); a minimal sketch of this pipeline follows below. [1] As far as I understand, what is nice here is that the NN itself has a physical interpretation. See also AI Feynman.

AI Feynman.
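A minimal sketch of the Data -> NN -> SR pipeline described above, assuming scikit-learn for the network. The "symbolic regression" step here is only a toy exhaustive search over a few hand-written candidate formulas, standing in for a real GA-based engine such as PySR or AI Feynman; the hidden law and all shapes are made up for illustration:

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)

# Synthetic "physics" data: y = x0 * x1 is the hidden law to rediscover.
X = rng.uniform(-2, 2, size=(2000, 2))
y = X[:, 0] * X[:, 1]

# Step 1 (Data -> NN): fit a generic neural network to the raw data.
nn = MLPRegressor(hidden_layer_sizes=(64, 64), max_iter=2000, random_state=0)
nn.fit(X, y)

# Step 2 (NN -> SR): symbolic regression against the *network's* predictions,
# scored by mean squared error over a tiny hypothesis space of formulas.
candidates = {
    "x0 + x1": lambda X: X[:, 0] + X[:, 1],
    "x0 * x1": lambda X: X[:, 0] * X[:, 1],
    "x0 - x1": lambda X: X[:, 0] - X[:, 1],
    "x0 ** 2": lambda X: X[:, 0] ** 2,
    "sin(x0)": lambda X: np.sin(X[:, 0]),
}

X_test = rng.uniform(-2, 2, size=(500, 2))
y_nn = nn.predict(X_test)   # the trained NN now stands in for the data
scores = {expr: np.mean((f(X_test) - y_nn) ** 2) for expr, f in candidates.items()}
best = min(scores, key=scores.get)
print(f"recovered formula: {best}  (MSE vs. NN: {scores[best]:.4f})")
```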


  1. ↩︎