Softmax

Background
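For reference, the standard definition of softmax (this is the textbook formula, not anything specific to the demo): given a vector of logits $z = (z_1, \dots, z_K)$, softmax turns it into a probability distribution:

```latex
\operatorname{softmax}(z)_i = \frac{e^{z_i}}{\sum_{j=1}^{K} e^{z_j}}
```

Each output is positive and the outputs sum to 1, which is why the result can be read as a set of probabilities.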

Jargon:

  - logits: the raw, unnormalized inputs to softmax.
  - softmax: a function that maps a vector of logits to a probability distribution.
  - cross-entropy: a loss that measures how well predicted probabilities match the true label.

Warm-Up Activity

Open the softmax and cross-entropy interactive demo that Prof Arnold created.

Try adjusting the logits (the inputs to softmax) to get a sense of how the outputs change. Describe the outputs when:

  1. All of the inputs are the same value. (Does it matter what the value is?)
  2. One input is much bigger than the others.
  3. One input is much smaller than the others.

Finally, describe the input that gives the largest possible value for output 1.
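If you want to check your answers without the demo, the three cases above can be reproduced with a small NumPy sketch (the `softmax` helper here is our own, not part of the demo):

```python
import numpy as np

def softmax(logits):
    # Subtract the max before exponentiating for numerical stability;
    # softmax is unchanged by adding a constant to every logit.
    z = np.asarray(logits, dtype=float)
    e = np.exp(z - z.max())
    return e / e.sum()

# 1. All inputs the same: the output is uniform, no matter what the shared value is.
print(softmax([0, 0, 0]))   # each output is 1/3
print(softmax([5, 5, 5]))   # still 1/3 each: only differences between logits matter

# 2. One input much bigger than the others: its output approaches 1.
print(softmax([10, 0, 0]))

# 3. One input much smaller: its output approaches 0 and the rest renormalize.
print(softmax([-10, 0, 0]))
```

Note that no finite logit makes an output exactly 1: pushing one logit far above the others only drives its output arbitrarily close to 1.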

Notebooks

Softmax, part 1 (u04n2-softmax.ipynb; preview, or open in Colab)
