In this course we will look at a handful of ubiquitous algorithms in machine learning. We will cover several classical tools in machine learning, but more emphasis will be given to recent advances and to developing efficient, provable algorithms for learning tasks. A tentative syllabus/schedule can be found below; the topics may change based on student interests as well.
There will be four assignments for the course covering 70% of the credit and a final covering 25%; the remaining 5% is for class participation (live-class participation and/or extensive contributions to online discussions). Each assignment will have both written and programming components. We'll predominantly use Gradescope for the assignments. The final will be online on Gradescope (and will be released as per the university schedule). You may discuss how to approach problems with others (a better option is to just ask questions openly on edStem), but see the course policies below and make sure you do not violate them.
Some background in algorithms, probability, and linear algebra (all at a basic undergraduate level) will be quite helpful.
• Assignment 0 (not to be submitted)
• [Assignment 1]: Lectures 1-4. (18 points)
• [Assignment 2]: Lectures 5-8. (18 points)
• [Assignment 3]: Lectures 9-12. (17 points)
• [Assignment 4]: Lectures 12-16. (17 points)
Solutions will be sketched on the discussion forum after the due date.
You can register here.
We will make use of edStem extensively. Videos will also be uploaded here. You should have received an invitation via your official UCLA email ID (the one used on CCLE); if you haven't, let me know immediately. It will be our main way of communicating with each other. Please ask questions on edStem so that others may also benefit from the answers.
We will also use it for releasing homework solutions. Homeworks themselves will be posted on Gradescope.
We will use Gradescope for the homework; assignments must be submitted by 10 PM on their due date. Things to keep in mind: 1) Within the first week of the course, you should receive a registration link from Gradescope; this will give you access to the website. If you don't receive it before the first homework, contact me. 2) Watch this one-minute video with complete instructions, and follow them to the letter! These simple guidelines make the process considerably smoother. 3) Make sure you start each problem of a homework on a new page. 4) To generate a PDF scan of the assignments, you can follow the instructions here; you can also use the scanners in the library.
It is strongly recommended to use LaTeX or other word processing software for submitting the homework. Grades will take into account both the correctness and the quality of the solutions. Correctness is a prerequisite but clarity is also important: you are responsible for communicating your solution in a legible and understandable way.
Some helpful guidelines: (1) Start early so that you can make use of office hours. (2) Serious, honest attempts count: there will be reasonable partial credit for attempts that show understanding of the problem and the concepts involved.
• Sanjeev Arora's course
• Moritz Hardt's course
• Elad Hazan's course
• Foundations of Data Science by Blum, Hopcroft and Kannan.
Here is a tentative list of topics for the course. I will adapt some of these topics based on student interest.
• Learning as optimization
• Gradient descent (see the short sketch after this list)
• Stochastic gradient descent
• Accelerated gradient descent methods
• Autograd and Adagrad
• Online optimization and regret minimization
• Multiplicative weights
• Boosting
• Best-fit subspaces, low-rank approximations
• Computing and applying the Singular Value Decomposition
• Modeling dependencies
• Learning and inference on graphical models
• GLMs and the Sparsitron
• How hard is it for learning to violate privacy?
• Models of privacy: differential privacy, the Laplace mechanism
• Differentially private SGD
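To give a flavor of the first few topics, here is a minimal sketch (not course material) comparing full-batch gradient descent with stochastic gradient descent on a least-squares problem. The data, step sizes, and iteration counts are illustrative assumptions, and the only dependency assumed is NumPy.

# A minimal sketch, assuming NumPy is available; illustrative code only.
# It compares full-batch gradient descent with stochastic gradient descent
# on least-squares regression with synthetic data.
import numpy as np

rng = np.random.default_rng(0)
n, d = 200, 5
X = rng.normal(size=(n, d))
w_true = rng.normal(size=d)
y = X @ w_true + 0.1 * rng.normal(size=n)

def loss(w):
    # Average squared error: f(w) = (1/2n) * ||Xw - y||^2
    return 0.5 * np.mean((X @ w - y) ** 2)

# Full-batch gradient descent with a fixed step size (eta is an assumption).
w_gd = np.zeros(d)
eta = 0.1
for _ in range(500):
    grad = X.T @ (X @ w_gd - y) / n   # gradient of f at w_gd
    w_gd -= eta * grad

# Stochastic gradient descent: one randomly chosen example per step,
# with a decaying step size.
w_sgd = np.zeros(d)
for t in range(5000):
    i = rng.integers(n)
    grad_i = (X[i] @ w_sgd - y[i]) * X[i]
    w_sgd -= (eta / np.sqrt(t + 1)) * grad_i

print(f"GD  loss: {loss(w_gd):.4f}")
print(f"SGD loss: {loss(w_sgd):.4f}")

Both runs should drive the loss close to the noise floor; the point of the comparison is that SGD uses a single example per update, which is the regime analyzed when we discuss convergence rates and step-size schedules in lecture.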