Many people say a computer can only do what it's been told to do. Well, it's true, it may start off that way, but it is only the start. A computer can be made to learn.
— attributed to A. Turing, Breaking the Code, 1986.
Google’s Machine Learning Crash Course
- Descending into ML
  - Explain the nature of a linear model.
  - Compare and contrast L2 loss vs. Mean Squared Error (MSE) (see the worked comparison below).
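As a reference point for the two items above, here is a minimal worked comparison. The linear-model form follows the crash course's notation; writing the prediction as ŷ and the dataset as D is my own labeling.

```latex
% A linear model predicts a label as a weighted sum of features plus a bias:
%   \hat{y} = b + w_1 x_1 + w_2 x_2 + \dots + w_n x_n
% L2 loss over a dataset D is the sum of squared errors; MSE is the average
% squared error per example, so the two differ only by the factor 1/|D|.
\[
  L_2(D) = \sum_{(x, y) \in D} \bigl(y - \hat{y}(x)\bigr)^2,
  \qquad
  \mathrm{MSE}(D) = \frac{1}{|D|} \sum_{(x, y) \in D} \bigl(y - \hat{y}(x)\bigr)^2
\]
```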
- Reducing Loss
  - Explain the nature of (stochastic) gradient descent (see the sketch below).
  - Terms:
    - Learning rate
    - Hyper-parameter
    - Batch & Mini-batch
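To make the terms above concrete, here is a minimal mini-batch SGD sketch in NumPy. The synthetic dataset, learning rate, batch size, and step count are illustrative choices, not values from the course.

```python
import numpy as np

# Minimal sketch of mini-batch stochastic gradient descent for a
# one-feature linear model y_hat = w * x + b, minimizing MSE.
rng = np.random.default_rng(0)
x = rng.uniform(-1.0, 1.0, size=200)
y = 3.0 * x + 2.0 + rng.normal(scale=0.1, size=200)   # true w=3, b=2 plus noise

w, b = 0.0, 0.0
learning_rate = 0.1       # hyper-parameter: how large each step is
batch_size = 16           # hyper-parameter: examples per gradient estimate

for step in range(500):
    # Sample a mini-batch; batch_size=1 would be "pure" SGD,
    # batch_size=len(x) would be full-batch gradient descent.
    idx = rng.integers(0, len(x), size=batch_size)
    xb, yb = x[idx], y[idx]

    # Gradient of MSE = mean((w*x + b - y)^2) with respect to w and b.
    error = w * xb + b - yb
    grad_w = 2.0 * np.mean(error * xb)
    grad_b = 2.0 * np.mean(error)

    # Step against the gradient, scaled by the learning rate.
    w -= learning_rate * grad_w
    b -= learning_rate * grad_b

print(f"learned w={w:.2f}, b={b:.2f}")   # should approach w≈3, b≈2
```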
- First Steps with TensorFlow
  - Do you believe that TensorFlow “can be used to encode anything you can imagine” (see the video)?
  - Compare and contrast tf.estimator vs. SciKit-Learn (see the sketch below).
  - What is a tensor?
  - Review the programming exercises for Pandas, TensorFlow and Synthetic Features, but save the actual programming “exercises” for the lab.
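As one possible starting point for the comparison question above, the sketch below shows the SciKit-Learn side of the pattern and uses NumPy arrays as a stand-in for tensors; the data is made up, and the tf.estimator contrast is only summarized in a comment.

```python
import numpy as np
from sklearn.linear_model import LinearRegression

# A tensor is essentially an n-dimensional array of values; numpy arrays
# stand in here for TensorFlow's tf.Tensor objects.
scalar = np.array(7.0)                  # rank-0 tensor (a single number)
vector = np.array([1.0, 2.0, 3.0])      # rank-1 tensor
matrix = np.array([[1.0, 2.0],
                   [3.0, 4.0]])         # rank-2 tensor
print(scalar.ndim, vector.ndim, matrix.ndim)   # 0 1 2

# SciKit-Learn's high-level pattern: build an estimator object, call fit()
# on in-memory arrays, then predict(). tf.estimator exposes a similar
# train/evaluate/predict surface, but is driven by input functions and
# feature columns rather than plain arrays.
X = np.array([[1.0], [2.0], [3.0], [4.0]])     # illustrative data
y = np.array([2.1, 3.9, 6.2, 8.0])
model = LinearRegression().fit(X, y)
print(model.coef_, model.intercept_)           # roughly slope 2, intercept 0
print(model.predict(np.array([[5.0]])))        # roughly [10.]
```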
- Generalization
  - Terms:
    - Occam’s razor
    - IID
    - Stationarity
  - Compare and contrast training vs. testing sets (see the sketch below).
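A small sketch of the training-vs-testing contrast: a needlessly complex model (cf. Occam’s razor) drives training error down while test error goes up. The dataset, noise level, and polynomial degrees are arbitrary illustrative choices.

```python
import numpy as np

# Why training and test metrics diverge: a complex model can fit the
# training data almost perfectly yet generalize poorly.
rng = np.random.default_rng(1)

def make_data(n):
    x = rng.uniform(-1.0, 1.0, size=n)
    y = 2.0 * x + 1.0 + rng.normal(scale=0.3, size=n)   # true relation is linear
    return x, y

x_train, y_train = make_data(12)
x_test, y_test = make_data(200)     # held-out data from the same distribution

for degree in (1, 9):
    coeffs = np.polyfit(x_train, y_train, degree)       # fit on training data only
    train_mse = np.mean((np.polyval(coeffs, x_train) - y_train) ** 2)
    test_mse = np.mean((np.polyval(coeffs, x_test) - y_test) ** 2)
    print(f"degree {degree}: train MSE {train_mse:.3f}, test MSE {test_mse:.3f}")
# Expect the degree-9 fit to show lower training error but higher test
# error than the simple degree-1 fit.
```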
- Training and Test Sets
  - Should we randomize our examples before splitting the train/test sets? If so, why; if not, why not? (See the sketch below.)
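One way to explore the randomization question: the sketch below builds a dataset that arrives sorted by label, as logged data often does, and compares a positional 80/20 split with and without shuffling. The dataset and split sizes are illustrative.

```python
import numpy as np

# Effect of shuffling before a train/test split.
rng = np.random.default_rng(2)
labels = np.sort(rng.normal(size=1000))      # 1000 examples, sorted by label

def split(y):
    """80/20 split by position: first 800 train, last 200 test."""
    return y[:800], y[800:]

# Without shuffling, the test set only ever sees the largest labels.
train, test = split(labels)
print(f"no shuffle:   train mean {train.mean():+.2f}, test mean {test.mean():+.2f}")

# After shuffling, both splits are drawn from the same distribution.
shuffled = rng.permutation(labels)
train, test = split(shuffled)
print(f"with shuffle: train mean {train.mean():+.2f}, test mean {test.mean():+.2f}")
```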
Programming Tools
- Pandas