Unit 6: Recap and Regularization

In this unit, after reviewing where we’ve been, we push towards state-of-the-art models (still focusing on computer vision). We’ll first show how our work over the last two weeks connects to the pre-trained models we used in the opening weeks. Then we’ll introduce or revisit tools that help our models achieve high performance, such as data augmentation and regularization. Finally, we’ll get more practice with how neural networks work from the ground up as we implement our own simple neural-net image classifier from scratch (definitely something to mention in an interview!). Students who complete this unit will demonstrate that they can:

Preparation

The fastai course videos are still a bit disorganized; sorry about that.

Supplemental Materials

Strategies for getting state-of-the-art performance:
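
As a rough preview of two of these strategies, here is a minimal sketch I put together (assuming PyTorch and torchvision; the transforms, model, and hyperparameters are purely illustrative, not taken from the readings) showing data augmentation plus two common regularizers, dropout and weight decay:

    # Illustrative sketch: data augmentation with torchvision transforms,
    # plus dropout and weight decay as explicit regularizers.
    import torch
    import torch.nn as nn
    from torchvision import transforms

    # Data augmentation: each epoch sees slightly different versions of every
    # image, which discourages the model from memorizing the training set.
    train_tfms = transforms.Compose([
        transforms.RandomHorizontalFlip(),
        transforms.RandomResizedCrop(224, scale=(0.8, 1.0)),
        transforms.ToTensor(),
    ])

    # A tiny (hypothetical) classifier with dropout, which randomly
    # zeroes activations during training.
    model = nn.Sequential(
        nn.Flatten(),
        nn.Linear(3 * 224 * 224, 256),
        nn.ReLU(),
        nn.Dropout(p=0.5),
        nn.Linear(256, 10),
    )

    # Weight decay (L2 regularization) is handled by the optimizer.
    optimizer = torch.optim.AdamW(model.parameters(), lr=1e-3, weight_decay=1e-2)

Note the division of labor: augmentation regularizes through the data, while dropout and weight decay regularize through the model and the optimizer.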

We’ll be doing some automatic differentiation this week:
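
If you want the core idea in a few lines before diving into the readings, here is a minimal automatic-differentiation sketch using PyTorch’s autograd (the function and the value of x are made up for illustration):

    # Minimal automatic differentiation example with PyTorch autograd.
    import torch

    # Mark x as requiring a gradient so operations on it are recorded.
    x = torch.tensor(3.0, requires_grad=True)

    # Build a small computation: y = x^2 + 2x + 1.
    y = x**2 + 2 * x + 1

    # backward() walks the recorded computation graph and fills in x.grad.
    y.backward()

    print(x.grad)  # tensor(8.), since dy/dx = 2x + 2 = 8 at x = 3

The point is that we never write the derivative by hand; the framework records the operations as we compute y and then works backwards to get dy/dx.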

Finally, I sometimes remark that “machine learning is lazy” (in that it tends to focus on superficial, easy features). Here’s a more precise statement of a related claim: What do deep networks learn and when do they learn it

Class Meetings

Monday

Review:

Wednesday

Friday

Contents

Due this Week