Deep neural networks are responsible for some of the greatest advances in modern computer science, …. At the heart of this progress are fundamental techniques developed starting more than 30 years ago by this year’s Turing Award winners, Yoshua Bengio, Geoffrey Hinton, and Yann LeCun. By dramatically improving the ability of computers to make sense of the world, deep neural networks are changing not just the field of computing, but nearly every field of science and human endeavor. — J. Dean, Google Senior Fellow, “Fathers of the Deep Learning Revolution Receive ACM A.M. Turing Award”
  1. Google’s Machine Learning Crash Course

    1. Neural Networks
      1. Terms
        • Neurons
        • Hidden layers
        • Activation function (e.g., sigmoid and ReLU)
      2. Compare and contrast handling non-linearities using feature crosses vs. neural networks.
      3. How does a neural network model non-linearities?
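
      A minimal sketch of the two questions above, assuming plain NumPy and hand-picked weights rather than the course’s TensorFlow exercises: XOR-style data is not linearly separable, so a linear model only works if a feature cross supplies the non-linearity by hand, whereas a hidden layer with a ReLU activation lets the network learn an equivalent non-linearity itself.

      ```python
      import numpy as np

      # XOR-style data: no single line separates the two classes.
      X = np.array([[0., 0.], [0., 1.], [1., 0.], [1., 1.]])
      y = np.array([0., 1., 1., 0.])

      # Feature-cross approach: appending x1*x2 by hand makes the
      # data linearly separable in the lifted 3-D feature space.
      X_crossed = np.hstack([X, (X[:, 0] * X[:, 1])[:, None]])

      # Neural-network approach: a hidden layer learns the non-linearity.
      def relu(z):
          return np.maximum(0.0, z)

      # Hand-picked weights for a 2-unit ReLU hidden layer that solves XOR;
      # in practice these would be found by gradient descent.
      W1 = np.array([[1., 1.], [1., 1.]])
      b1 = np.array([0., -1.])
      W2 = np.array([1., -2.])

      hidden = relu(X @ W1 + b1)   # non-linear transformation of the inputs
      output = hidden @ W2         # linear read-out of the hidden units
      print(output)                # [0. 1. 1. 0.] -- matches y exactly
      ```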
    2. Training Neural Networks
      1. Give a general explanation of how back-propagation works; a worked sketch follows the terms below. N.B.: the “Back-propagation algorithm visual explanation” webpage is particularly useful here.
      2. Terms
        • Vanishing/exploding gradients
        • Dead ReLUs
        • Dropout
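
      A worked back-propagation sketch under illustrative assumptions (plain NumPy, a 2-4-1 sigmoid network on XOR with squared-error loss; the layer sizes and learning rate are arbitrary choices, not the course’s): the backward pass walks the chain rule from output to input, reusing the activations cached during the forward pass. Each sigmoid contributes a derivative factor h*(1-h) of at most 0.25, which is exactly where vanishing gradients come from in deep sigmoid stacks; a dead ReLU is the ReLU analogue, a unit whose gradient factor is stuck at 0.

      ```python
      import numpy as np

      rng = np.random.default_rng(0)

      # XOR again: one hidden layer makes it learnable.
      X = np.array([[0., 0.], [0., 1.], [1., 0.], [1., 1.]])
      y = np.array([[0.], [1.], [1.], [0.]])

      def sigmoid(z):
          return 1.0 / (1.0 + np.exp(-z))

      W1 = rng.normal(size=(2, 4)); b1 = np.zeros(4)   # hidden layer
      W2 = rng.normal(size=(4, 1)); b2 = np.zeros(1)   # output layer
      lr = 0.5

      for step in range(10_000):
          # Forward pass: cache every intermediate for the backward pass.
          h = sigmoid(X @ W1 + b1)
          p = sigmoid(h @ W2 + b2)
          loss = ((p - y) ** 2).sum()

          # Backward pass: chain rule, layer by layer, output to input.
          dp  = 2.0 * (p - y)        # dL/dp
          dz2 = dp * p * (1.0 - p)   # through the output sigmoid
          dW2 = h.T @ dz2            # dL/dW2
          db2 = dz2.sum(axis=0)
          dh  = dz2 @ W2.T           # error signal sent back to the hidden layer
          dz1 = dh * h * (1.0 - h)   # through the hidden sigmoid (<= 0.25 factor)
          dW1 = X.T @ dz1
          db1 = dz1.sum(axis=0)

          # Plain gradient-descent update.
          W1 -= lr * dW1; b1 -= lr * db1
          W2 -= lr * dW2; b2 -= lr * db2

      print(round(loss, 4))   # typically near 0 (XOR training can occasionally stall)
      print(p.round(2))       # predictions close to [0, 1, 1, 0]
      ```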
    3. Multi-Class Neural Networks
      1. Terms
        • One-vs-All
        • Softmax
        • Logits
      2. Does the softmax layer have to have the same number of nodes as the output layer? If so, why; if not, why not?
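
      A minimal sketch of the three terms above, which also bears on question 2 (plain NumPy; the logit values are arbitrary): softmax maps the raw logits, one per class, to a probability distribution that sums to 1, while one-vs-all runs an independent sigmoid per class, so its scores need not sum to 1.

      ```python
      import numpy as np

      def softmax(logits):
          # Subtracting the max is for numerical stability; the result is unchanged.
          z = logits - logits.max(axis=-1, keepdims=True)
          e = np.exp(z)
          return e / e.sum(axis=-1, keepdims=True)

      logits = np.array([2.0, 1.0, 0.1])    # raw, unnormalized scores, one per class
      probs = softmax(logits)
      print(probs.round(3), probs.sum())    # [0.659 0.242 0.099] 1.0

      # One-vs-all, by contrast: an independent sigmoid per class, so the
      # per-class scores are each in (0, 1) but need not sum to 1.
      one_vs_all = 1.0 / (1.0 + np.exp(-logits))
      print(one_vs_all.round(3))            # [0.881 0.731 0.525] -- sums to ~2.14
      ```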
  2. Google’s ML Practicum: Image Classification — Study the first three sections, “Introduction” through “Check Your Understanding”.

    1. Why doesn’t a simple fully connected network like the one we used for the MNIST dataset work for image classification in general?
    2. Terms
      1. Convolution
      2. Convolved Feature
      3. Pooling
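
    A minimal NumPy sketch of the terms above (a toy 6x6 “image” and a hand-written vertical-edge kernel; real conv nets learn their kernels, and like most deep-learning libraries this “convolution” is actually cross-correlation): sliding the kernel across the image yields the convolved feature (feature map), and max pooling downsamples it into a smaller, shift-tolerant summary.

    ```python
    import numpy as np

    def conv2d(image, kernel):
        # "Valid" 2-D convolution: slide the kernel over every position
        # where it fits entirely inside the image.
        kh, kw = kernel.shape
        oh, ow = image.shape[0] - kh + 1, image.shape[1] - kw + 1
        out = np.empty((oh, ow))
        for i in range(oh):
            for j in range(ow):
                out[i, j] = (image[i:i+kh, j:j+kw] * kernel).sum()
        return out

    def max_pool(fmap, size=2):
        # Non-overlapping max pooling (stride == window size).
        h, w = fmap.shape[0] // size, fmap.shape[1] // size
        out = np.empty((h, w))
        for i in range(h):
            for j in range(w):
                out[i, j] = fmap[i*size:(i+1)*size, j*size:(j+1)*size].max()
        return out

    # A 6x6 "image": bright left half, dark right half (a vertical edge).
    image = np.zeros((6, 6))
    image[:, :3] = 1.0

    kernel = np.array([[1., 0., -1.],
                       [1., 0., -1.],
                       [1., 0., -1.]])   # responds to vertical edges

    convolved = conv2d(image, kernel)     # the "convolved feature" / feature map
    pooled = max_pool(convolved)          # downsampled, shift-tolerant summary
    print(convolved[0])                   # [0. 3. 3. 0.] -- strongest at the edge
    print(convolved.shape, pooled.shape)  # (4, 4) (2, 2)
    ```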

    You will complete some of the exercises as part of the lab.