In this course we will look at a handful of ubiquitous algorithms in machine learning. We will cover several classical tools, but the emphasis will be on recent advances and on developing efficient, provable algorithms for learning tasks. A tentative syllabus/schedule can be found below; the topics may change based on student interests. You can also check last year's course notes for more details about what's to come.
Make sure you can answer the questions in Assignment 0 for yourselves before registering for the course.
There will be four assignments, together worth 50% of the grade, and two (non-cumulative) exams worth 25% each. Each assignment will have both written and programming components. We'll predominantly use Gradescope for the assignments. The final exam will follow the university schedule. You may discuss how to solve problems with others (a better option is to just ask questions openly on edStem), but see the course policies below and make sure you do not violate them.
Background in algorithms, probability, and linear algebra (all at a basic undergraduate level) is required and will be assumed throughout the course.
Please click on the triangle to the right of a sub-heading to expand it and see more details (if any).
▶ Each lecture is on a separate page.
▶ Part 1: Optimization
▶ Part 2: PCA
▶ Part 3: Privacy
▶ Yimeng Wang: Monday 7-8PM, Thursday 12-1PM, Friday 4-5PM
▶ Exam 1: May 7, 12:00-1:50pm, in class (in-person section). Covers Lectures 1-9.
▶ Exam 1 MSOL: May 7, 2pm release on Gradescope. MSOL students will have a 72-hour window within which to start the exam; once they start, they'll have two hours to submit. Please adhere to the honor code for the exam: no collaborating with other students. You may use notes but no online resources.
▶ Exam 2: June 11, 3-5pm (university schedule). Location to be decided.
▶ Exam 2 MSOL: June 11, 5pm release on Gradescope. MSOL students will have a 72-hour window within which to start the exam; once they start, they'll have two hours to submit. Please adhere to the honor code for the exam: no collaborating with other students. You may use notes but no online resources.
▶ Assignment 1 (Lectures 1-5): out April 14, 6pm; due April 21, 10pm. [14 points]
▶ Assignment 2 (Lectures 6-9): out April 28, 6pm; due May 5, 10pm. [11 points]
▶ Assignment 3 (Lectures 10-14): out May 19, 6pm; due May 28, 10pm. [14 points]
▶ Assignment 4 (Lectures 15-18): out June 2, 6pm; due June 9, 10pm. [11 points]
Solutions will be sketched on the discussion forum after the due date.
We will make use of edStem extensively; videos will also be uploaded there. You should have received an invitation via your official UCLA email address (the one used on CCLE). If you haven't, let me know immediately. edStem will be our main channel for communication, so please ask questions there so that others may also benefit from the answers.
We will also use it for releasing homework solutions. Homeworks themselves will be posted on Gradescope.
Registration links will be provided soon.
We will use Gradescope for homework; submissions are due by 10PM on the due date. Things to keep in mind:
1) Within the first week of the course, you should receive a registration link from Gradescope, which will give you access to the site. If you don't receive it before the first homework, contact me.
2) Watch this video with instructions and follow them to the letter! The simple guidelines make the process considerably smoother.
3) Start each problem of a homework on a new page.
4) To generate a PDF scan of your assignment, follow the instructions here; you can also use the scanners in the library.
It is strongly recommended that you use LaTeX or other word-processing software for the homework. Grades will take into account both the correctness and the quality of your solutions. Correctness is a prerequisite, but clarity is also important: you are responsible for communicating your solution in a legible and understandable way.
Some helpful guidelines: (1) Start early so you can make use of office hours. (2) Serious, honest attempts count: there will be reasonable partial credit for attempts that show understanding of the problem and concepts involved.
▶ Sanjeev Arora's course
▶ Moritz Hardt's course
▶ Elad Hazan's course
▶ Foundations of Data Science by Blum, Hopcroft and Kannan.
Here is a tentative list of topics for the course; I will adapt some of them based on student interest. Minimal code sketches illustrating several of the topics appear after the list.
▶ Learning as optimization
▶ Gradient descent
▶ Stochastic gradient descent
▶ Accelerated (Momentum, Nesterov) gradient methods
▶ Adaptive gradient descent methods (Adagrad, Adam)
▶ Optimization in practice: comparing GD, SGD, Momentum, NAGD, Adam, Shampoo
▶ Online optimization and regret minimization
▶ Multiplicative weights
▶ Stochastic bandits and the UCB algorithm
▶ Best-fit subspaces, LoRA
▶ Computing and applying the Singular Value Decomposition
▶ How hard is it for learning to violate privacy?
▶ Models of privacy: differential privacy, the Laplace mechanism
▶ Differentially private SGD
▶ Transformers
▶ Representational power of transformers
▶ Locality Sensitive Hashing and the KV cache
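To give a flavor of the first optimization topics, here is a minimal sketch (not course material; the data, dimensions, and step sizes are synthetic and purely illustrative) comparing full-batch gradient descent with stochastic gradient descent on least-squares regression:

```python
# Minimal sketch: GD vs. SGD on min_w ||Xw - y||^2 / (2n).
# All problem sizes and step sizes below are illustrative choices.
import numpy as np

rng = np.random.default_rng(0)
n, d = 200, 5
X = rng.normal(size=(n, d))
w_true = rng.normal(size=d)
y = X @ w_true + 0.1 * rng.normal(size=n)

def grad(w, Xb, yb):
    # Gradient of the average squared error over the (mini)batch.
    return Xb.T @ (Xb @ w - yb) / len(yb)

# Full-batch gradient descent with a fixed step size.
w = np.zeros(d)
for _ in range(500):
    w -= 0.1 * grad(w, X, y)

# Stochastic gradient descent: one random example per step, decaying step size.
w_sgd = np.zeros(d)
for t in range(1, 5001):
    i = rng.integers(n)
    w_sgd -= (1.0 / t**0.5) * grad(w_sgd, X[i:i+1], y[i:i+1])

print(np.linalg.norm(w - w_true), np.linalg.norm(w_sgd - w_true))
```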
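For the online optimization topic, a minimal sketch of the multiplicative weights update for the experts problem; the loss matrix here is random, just to show the mechanics:

```python
# Minimal sketch: multiplicative weights over k experts for T rounds.
import numpy as np

rng = np.random.default_rng(1)
T, k = 1000, 10                     # rounds and experts (illustrative)
losses = rng.uniform(size=(T, k))   # synthetic loss of each expert per round
eta = np.sqrt(np.log(k) / T)        # standard step-size choice

w = np.ones(k)
total, best = 0.0, losses.sum(axis=0).min()
for t in range(T):
    p = w / w.sum()                 # play the weighted distribution over experts
    total += p @ losses[t]          # expected loss incurred this round
    w *= np.exp(-eta * losses[t])   # downweight experts that did badly

print("regret:", total - best)      # grows like sqrt(T log k)
```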
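Similarly, a minimal sketch of the UCB1 strategy for stochastic bandits, with Bernoulli arms whose means are invented for illustration:

```python
# Minimal sketch: UCB1 on Bernoulli arms with made-up means.
import numpy as np

rng = np.random.default_rng(2)
means = np.array([0.3, 0.5, 0.7])    # unknown to the algorithm
k, T = len(means), 5000
counts = np.zeros(k)
sums = np.zeros(k)

for t in range(T):
    if t < k:
        a = t                        # pull each arm once to initialize
    else:
        ucb = sums / counts + np.sqrt(2 * np.log(t) / counts)
        a = int(np.argmax(ucb))      # optimism in the face of uncertainty
    r = float(rng.random() < means[a])
    counts[a] += 1
    sums[a] += r

print("pulls per arm:", counts)      # the best arm should dominate
```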
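For the PCA/SVD topics, a short sketch of computing a best rank-k approximation (equivalently, projecting onto a best-fit k-dimensional subspace) using numpy's SVD; the matrix and k are placeholders:

```python
# Minimal sketch: best rank-k approximation via the truncated SVD.
import numpy as np

rng = np.random.default_rng(3)
A = rng.normal(size=(50, 30))        # synthetic data matrix
k = 5

U, s, Vt = np.linalg.svd(A, full_matrices=False)
A_k = U[:, :k] * s[:k] @ Vt[:k]      # keep the top k singular directions

# By the Eckart-Young theorem, the spectral-norm error equals the
# (k+1)-st singular value.
print(np.linalg.norm(A - A_k, 2), s[k])
```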
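For the privacy topics, a minimal sketch of the Laplace mechanism on a counting query; the dataset, predicate, and epsilon are all placeholders:

```python
# Minimal sketch: the Laplace mechanism for a counting query.
import numpy as np

rng = np.random.default_rng(4)

def laplace_count(data, predicate, eps):
    # A counting query changes by at most 1 when one record changes,
    # so its sensitivity is 1 and Laplace noise of scale 1/eps gives
    # eps-differential privacy.
    true_count = sum(predicate(x) for x in data)
    return true_count + rng.laplace(scale=1.0 / eps)

data = rng.integers(0, 100, size=1000)           # toy dataset of ages
print(laplace_count(data, lambda x: x >= 65, eps=0.5))
```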
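And for the transformers topics, a bare-bones sketch of single-head scaled dot-product attention in numpy; the shapes and single-head setup are illustrative only:

```python
# Minimal sketch: scaled dot-product attention, the core transformer operation.
import numpy as np

def attention(Q, K, V):
    d = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d)                 # query-key similarities
    scores -= scores.max(axis=-1, keepdims=True)  # stabilize the softmax
    P = np.exp(scores)
    P /= P.sum(axis=-1, keepdims=True)            # attention weights per query
    return P @ V                                  # weighted average of values

rng = np.random.default_rng(5)
Q, K, V = (rng.normal(size=(8, 16)) for _ in range(3))
print(attention(Q, K, V).shape)                   # (8, 16)
```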