Neural Networks and Deep Learning
Lecture, four hours; discussion, two hours; outside study, six hours. Requisites: courses 131A, 133A or 205A, and M146, or equivalent. Review of machine learning concepts; maximum likelihood; supervised classification; neural network architectures; backpropagation; regularization for training neural networks; optimization for training neural networks; convolutional neural networks; practical CNN architectures; deep learning libraries in Python; recurrent neural networks, backpropagation through time, long short-term memory and gated recurrent units; variational autoencoders; generative adversarial networks; adversarial examples and training. Concurrently scheduled with course C247. Letter grading.
Review Summary
- Clarity: 10.0 / 10
- Organization: 10.0 / 10
- Time: 5-10 hrs/week
- Overall: 10.0 / 10
Reviews
Kao is an absolutely fantastic professor. His lectures are clear and engaging, and they break difficult concepts down into understandable chunks. He provides excellent slides, both the annotated versions from class and the unannotated originals, which are wonderful for studying. His slides often mention cutting-edge research in deep learning. Seriously, this is what a proper college class should feel like.
Although the class has listed prerequisites, they're not enforced. ECE 133A isn't really required (I didn't take it and did just fine). ECE/CS M146 isn't really necessary either; it's just background that's mentioned in passing during lectures (I also hadn't taken it). You really do need to take a probability class, though, even if it's not ECE 131A (STATS 100A or MATH 170E, etc. will do fine), or you'll be lost in the first half of the class.
The homeworks are quite time-consuming, but there were only five. They're a mixture of written math solutions and Python coding in Jupyter notebooks. It's helpful to have some exposure to Python before the class (even better if you're already familiar with NumPy). The homeworks are well spaced out, so there's plenty of time to complete them, and the TAs provide exceptional help during discussions (seriously, don't skip discussions; the TAs practically solve homework problems sometimes). Kao gives three "late days" across all the homeworks, so the deadlines are a little flexible.
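For readers gauging what "familiarity with NumPy" means here: coding homeworks of this kind typically lean on vectorized array operations and broadcasting rather than Python loops. A minimal illustrative sketch (not from the actual assignments):

```python
import numpy as np

# Forward pass of a fully connected layer for a batch of inputs,
# written the vectorized way such homeworks usually expect.
X = np.array([[1.0, 2.0], [3.0, 4.0]])   # batch of 2 examples, 2 features each
W = np.array([[0.5, -0.5], [1.0, 1.0]])  # weights: 2 inputs -> 2 units
b = np.array([0.1, -0.1])                # bias, broadcast across the batch

Z = X @ W + b           # matrix multiply plus broadcasting: shape (2, 2)
A = np.maximum(Z, 0.0)  # element-wise ReLU activation

print(A)  # [[2.6 1.4]
          #  [5.6 2.4]]
```

If expressions like `X @ W + b` and the broadcasting of `b` across rows look unfamiliar, it's worth working through a NumPy tutorial before the first assignment.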
Instead of a final exam, there is a final group project where you apply everything you learned during the quarter to a deep learning project. Kao provides a default project (in case you aren't creative, like me). It requires a fair amount of work, but it's due before finals week, so if you start early enough it doesn't interfere with studying for other classes.
Overall, this was one of the best courses I've taken at UCLA, and Kao is one of the best professors in the ECE department. If you're at all interested in machine learning, I highly recommend you take this class before you graduate. CS majors can probably petition it to count as an elective.
The first part of this review is for people who are considering taking this class without prior experience. I highly recommend not taking this class unless you took M146 and have some experience in machine learning, neither of which I did (that's my fault). The course is very math-heavy at the start, and Kao doesn't define many of the ML terms he already expects you to know.
Generally, the workload is also very intense and very much requires an understanding of NumPy (it will be extremely painful if you don't have one). Luckily, Tonmoy was very helpful in his office hours, but the homeworks will generally be awful.
The class is very theory-based. You learn a lot about how neural networks function and what each hyperparameter does, but you won't be taught much about how to use frameworks such as PyTorch or good practices for training a model.
I also personally would have changed the grading scheme a bit. 50% for the midterm is a bit excessive. There was also double jeopardy on the backprop question on the midterm: if you made the same small mistake on both sides of the backprop, you would lose the points for both sides; you could lose 4% of your total grade in the class just for making a small algebra mistake. The 2% extra credit is nice, though, and the project is graded very leniently.
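To illustrate the "two sides" complaint above: on a typical backprop question you derive both the gradient with respect to the weights and the gradient with respect to the layer's input, and both reuse the same upstream gradient, so one algebra slip in the shared term costs points twice. A hedged NumPy sketch (illustrative only, not the actual exam problem), with a numerical check that would catch such a slip:

```python
import numpy as np

# Single linear layer y = x @ W with loss L = 0.5 * sum(y**2).
# Backprop yields two gradients that share the upstream term dL/dy = y:
#   dL/dW = x.T @ y   (gradient w.r.t. the weights)
#   dL/dx = y @ W.T   (gradient w.r.t. the input, passed to the previous layer)
rng = np.random.default_rng(0)
x = rng.standard_normal((4, 3))
W = rng.standard_normal((3, 2))

y = x @ W
dW = x.T @ y   # one side of the backprop
dx = y @ W.T   # the other side, built from the same upstream gradient

# Numerical check of dL/dW via central differences; an error in the
# shared upstream term would make this comparison fail.
eps = 1e-6
num_dW = np.zeros_like(W)
for i in range(W.shape[0]):
    for j in range(W.shape[1]):
        Wp, Wm = W.copy(), W.copy()
        Wp[i, j] += eps
        Wm[i, j] -= eps
        Lp = 0.5 * np.sum((x @ Wp) ** 2)
        Lm = 0.5 * np.sum((x @ Wm) ** 2)
        num_dW[i, j] = (Lp - Lm) / (2 * eps)

print(np.allclose(dW, num_dW, atol=1e-4))  # True
```

Gradient-checking like this is also a good habit for the homeworks: it verifies an analytic derivative independently of the algebra.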
amazing lecturer. you learn a lot. it is very difficult and a lot of work, but kao is a great professor who is very accessible and accommodating. you will come out of this with a very clear understanding of neural networks
Grading Information
- Has a group project
- Attendance not required
- 1 midterm
- No final
- 33% recommend the textbook
Previous Grades
Grade distributions not available.