In this unit we extend our modeling skills to encompass classification models, and we start to build the tools that will let us represent complex functions using hidden layers. Both of these objectives require us to learn about nonlinear operations. We’ll focus on the two most commonly used ones: the softmax operator, which converts a vector of raw scores into a probability distribution, and the rectifier (“ReLU”), which replaces negative values with zero.
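To make these two operations concrete, here is a minimal PyTorch sketch (the score values are made up for illustration):

```python
import torch
import torch.nn.functional as F

scores = torch.tensor([2.0, 1.0, -1.0])  # raw scores ("logits"); arbitrary values

# softmax: exponentiate each score, then divide by the sum,
# so the outputs are all positive and sum to 1 (a probability distribution)
probs = F.softmax(scores, dim=0)
print(probs)        # ≈ tensor([0.7054, 0.2595, 0.0351])

# ReLU: replace every negative value with zero, leave the rest unchanged
activations = F.relu(scores)
print(activations)  # tensor([2., 1., 0.])
```

Note that softmax operates on a whole vector at once, while ReLU applies elementwise.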
Students who complete this unit will demonstrate that they can:
The fastai course videos are a bit disorganized here, sorry about that.
We’re using Elo scores for intuition a few times this week, but we’re intentionally not diving deep into them. If you do want to dive deep:
Deuteronomy 22:4: responsibility for what we build. Resilience.
Discuss: Explain “gradient” in the context of last week’s activity and lab.
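If a concrete refresher helps the discussion, here is a tiny autograd sketch (the function y = x² and the input value are arbitrary choices, not taken from last week’s materials):

```python
import torch

# The gradient tells us how the output changes as the input changes.
x = torch.tensor(3.0, requires_grad=True)
y = x ** 2     # arbitrary example function: y = x^2
y.backward()   # compute dy/dx via autograd
print(x.grad)  # tensor(6.) -- the slope of x^2 at x = 3
```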
Where we are: review last week’s learning objectives
Where a linear layer fits in a neural net
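As a concrete reminder of what a linear layer computes, a minimal PyTorch sketch (the layer sizes and input here are arbitrary):

```python
import torch
from torch import nn

# A linear layer computes: outputs = inputs @ weights.T + bias
layer = nn.Linear(in_features=4, out_features=2)

x = torch.randn(1, 4)  # one example with 4 features
out = layer(x)         # shape: (1, 2)

# The same computation, written out by hand:
manual = x @ layer.weight.T + layer.bias
print(torch.allclose(out, manual))  # True
```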
Introduction to classification
plot_top_losses output: probs and losses
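For reference, this is how that output typically gets produced in fastai; this sketch assumes an already-trained Learner named `learn`, which isn’t defined here:

```python
from fastai.vision.all import *

# Assumes `learn` is an already-trained vision Learner.
interp = ClassificationInterpretation.from_learner(learn)

# Show the 9 validation images the model got "most wrong";
# each panel is titled prediction/actual/loss/probability.
interp.plot_top_losses(9, nrows=3)
```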