Many people say a computer can only do what it's been told to do. Well, it's true, it may start off that way, but it is only the start. A computer can be made to learn. — attributed to A. Turing, Breaking the Code, 1986.
  1. Google’s Machine Learning Crash Course

    1. Descending into ML
      1. Explain the nature of a linear model.
      2. Compare and contrast L2 loss vs. Mean Squared Error (MSE).
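The L2 loss/MSE comparison above can be sketched in a few lines; the labels and predictions below are made-up values for illustration only:

```python
# L2 loss is the sum of squared errors; MSE is the L2 loss averaged
# over the number of examples. The toy values below are illustrative.

def l2_loss(labels, predictions):
    """Sum of squared errors over all examples."""
    return sum((y - p) ** 2 for y, p in zip(labels, predictions))

def mse(labels, predictions):
    """Mean squared error: L2 loss divided by the number of examples."""
    return l2_loss(labels, predictions) / len(labels)

labels = [3.0, -0.5, 2.0, 7.0]
predictions = [2.5, 0.0, 2.0, 8.0]

print(l2_loss(labels, predictions))  # 1.5
print(mse(labels, predictions))      # 0.375
```

The two measures rank models identically on a fixed dataset; MSE is preferred when comparing across datasets of different sizes, since it does not grow with the number of examples.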
    2. Reducing Loss
      1. Explain the nature of (stochastic) gradient descent.
      2. Terms:
        • Learning rate
        • Hyper-parameter
        • Batch & Mini-batch
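The terms above can be tied together in one sketch of mini-batch stochastic gradient descent for a one-feature linear model; the learning rate, batch size, epoch count, and synthetic data are all illustrative choices, not values from the course:

```python
import random

# Mini-batch SGD for y = w*x + b, minimizing MSE.
# learning_rate is the hyperparameter controlling the step size;
# batch_size controls how many examples each gradient estimate averages over.

def sgd(xs, ys, learning_rate=0.05, batch_size=4, epochs=200, seed=0):
    rng = random.Random(seed)
    w, b = 0.0, 0.0
    examples = list(zip(xs, ys))
    for _ in range(epochs):
        rng.shuffle(examples)  # "stochastic": visit examples in random order
        for i in range(0, len(examples), batch_size):
            batch = examples[i:i + batch_size]  # one mini-batch
            # Gradient of MSE w.r.t. w and b, averaged over the batch.
            grad_w = sum(2 * (w * x + b - y) * x for x, y in batch) / len(batch)
            grad_b = sum(2 * (w * x + b - y) for x, y in batch) / len(batch)
            w -= learning_rate * grad_w  # step opposite the gradient
            b -= learning_rate * grad_b
    return w, b

# Noiseless synthetic data from y = 2x + 1, so SGD should recover w≈2, b≈1.
xs = [x / 10 for x in range(20)]
ys = [2 * x + 1 for x in xs]
w, b = sgd(xs, ys)
print(w, b)  # close to 2.0 and 1.0
```

Setting `batch_size=1` gives pure stochastic gradient descent; setting it to `len(xs)` gives full-batch gradient descent.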
    3. First Steps with TensorFlow
      1. Do you believe that TensorFlow “can be used to encode anything you can imagine” (see the video)?
      2. Compare and contrast tf.estimator vs. scikit-learn.
      3. What is a tensor?
      Review the programming exercises for Pandas, TensorFlow, and Synthetic Features, but save the actual programming “exercises” for the lab.
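For the “what is a tensor?” question: a tensor is an n-dimensional array of values, characterized by its rank (number of dimensions) and shape. The sketch below uses NumPy arrays to illustrate, since TensorFlow tensors carry the same rank/shape idea:

```python
import numpy as np

# Rank = number of dimensions; shape = size along each dimension.
# NumPy arrays are used here to illustrate the same concepts
# that TensorFlow tensors carry.

scalar = np.array(3.0)                       # rank 0: a single value
vector = np.array([1.0, 2.0, 3.0])           # rank 1: shape (3,)
matrix = np.array([[1.0, 2.0], [3.0, 4.0]])  # rank 2: shape (2, 2)

print(scalar.ndim, vector.ndim, matrix.ndim)  # 0 1 2
print(matrix.shape)                           # (2, 2)
```

In TensorFlow, `tf.constant([[1.0, 2.0], [3.0, 4.0]])` would produce the rank-2 analogue of `matrix`.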
    4. Generalization
      1. Terms:
        • Occam’s razor
        • IID
        • Stationarity
      2. Compare and contrast training vs. testing sets.
    5. Training and Test Sets
      1. Should we randomize our examples before splitting them into training and test sets? If so, why; if not, why not?
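One way to see why randomizing matters: if the examples arrive sorted (here, by label), a naive split puts all of one class into the test set. The data and the 80/20 split ratio below are illustrative:

```python
import random

# 100 made-up examples, sorted by label: 80 of class 0, then 20 of class 1.
examples = [("x%d" % i, 0) for i in range(80)] + [("x%d" % i, 1) for i in range(20)]

# Naive split on sorted data: the test set contains only label 1,
# so it is not representative of the training distribution.
train, test = examples[:80], examples[80:]
print({label for _, label in test})  # {1}

# Shuffle first (fixed seed for reproducibility), then split.
rng = random.Random(0)
shuffled = examples[:]
rng.shuffle(shuffled)
train, test = shuffled[:80], shuffled[80:]
print({label for _, label in test})  # typically both labels now appear
```

Shuffling before splitting helps both sets approximate draws from the same distribution, which is exactly the IID assumption discussed under Generalization.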
  2. Programming Tools

    1. Pandas
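Before the Pandas exercises, it may help to recall the two core structures they build on: `Series` (a 1-D labeled array) and `DataFrame` (a table of named columns). A minimal sketch, with made-up city data:

```python
import pandas as pd

# A DataFrame is a table of named columns; each column is a Series.
# The population figures below are made up for illustration.
cities = pd.DataFrame({
    "city": ["San Francisco", "San Jose", "Sacramento"],
    "population": [852469, 1015785, 485199],
})

# Adding a derived column is a common step in the exercises.
cities["population_millions"] = cities["population"] / 1_000_000

print(cities.head())
print(cities["population"].max())  # 1015785
```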