In this unit, after reviewing where we’ve been, we push toward state-of-the-art models (still focusing on computer vision). We’ll first show how our work over the last two weeks connects to the pre-trained models we used in the opening weeks. Then we’ll introduce or revisit tools that help our models achieve high performance, such as data augmentation and regularization. Finally, we’ll get more practice with how neural networks work from the ground up as we implement our own simple neural net image classifier from scratch (definitely something to mention in an interview!). Students who complete this unit will demonstrate that they can:
The fastai course videos are still a bit disorganized, sorry about that.
Strategies for getting state-of-the-art performance:
We find that our neural networks achieve performance comparable to pre-trained DNNs, even though they have far fewer parameters and do not rely on third-party datasets.
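To make the data augmentation strategy concrete, here is a minimal sketch using torchvision transforms (my own illustration; the specific transforms and parameters are assumptions, and fastai’s aug_transforms wraps similar operations):

```python
from torchvision import transforms

# Randomized transforms: each epoch the model sees a slightly different
# version of every image, which acts as an implicit regularizer.
train_tfms = transforms.Compose([
    transforms.RandomHorizontalFlip(p=0.5),               # mirror half the images
    transforms.RandomRotation(degrees=10),                # small random rotations
    transforms.ColorJitter(brightness=0.2, contrast=0.2), # mild lighting changes
    transforms.ToTensor(),
])

# Evaluation should be deterministic: convert to a tensor, nothing random.
eval_tfms = transforms.ToTensor()
```

Because the transforms are re-sampled every time an image is loaded, the model effectively never sees the exact same training image twice, which is one source of the regularization effect mentioned in the overview.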
We’ll be doing some automatic differentiation this week:
autograd-for-dummies: A minimal autograd engine and neural network library for machine learning students (a tiny sketch of the core idea appears below).

Finally, I sometimes remark that “machine learning is lazy” (in that it tends to latch onto superficial, easy-to-learn features). Here’s a more precise statement of a related claim: What do deep networks learn and when do they learn it
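To give a taste of what an engine like autograd-for-dummies does, here is a minimal scalar reverse-mode sketch (my own illustration, not that library’s actual API): each value remembers the operations that produced it, and backward() replays them in reverse to accumulate gradients via the chain rule.

```python
class Value:
    """A scalar that remembers how it was computed, for reverse-mode autodiff."""

    def __init__(self, data, parents=()):
        self.data = data
        self.grad = 0.0
        self._parents = parents
        self._backward = lambda: None  # filled in by the op that created this node

    def __add__(self, other):
        other = other if isinstance(other, Value) else Value(other)
        out = Value(self.data + other.data, (self, other))
        def backward_fn():
            self.grad += out.grad    # d(a+b)/da = 1
            other.grad += out.grad   # d(a+b)/db = 1
        out._backward = backward_fn
        return out

    def __mul__(self, other):
        other = other if isinstance(other, Value) else Value(other)
        out = Value(self.data * other.data, (self, other))
        def backward_fn():
            self.grad += other.data * out.grad   # d(a*b)/da = b
            other.grad += self.data * out.grad   # d(a*b)/db = a
        out._backward = backward_fn
        return out

    def backward(self):
        # Topologically sort the graph, then apply the chain rule in reverse.
        order, seen = [], set()
        def visit(v):
            if v not in seen:
                seen.add(v)
                for p in v._parents:
                    visit(p)
                order.append(v)
        visit(self)
        self.grad = 1.0
        for v in reversed(order):
            v._backward()

# Usage: for y = x*x + x at x = 3, dy/dx = 2x + 1 = 7.
x = Value(3.0)
y = x * x + x
y.backward()
print(x.grad)  # 7.0
```

PyTorch’s autograd works the same way in spirit, just on tensors and with far more operations.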
Review: