For this homework, do the following:

  1. Speculate on whether so-called “deep” neural networks are destined to be another bust, as perceptrons and expert systems were before them, or whether they are a genuine breakthrough that will stay in use for years to come. Give a two-to-three-paragraph answer, with examples to back up your argument.

  2. Hand-compute a single, complete back-propagation cycle. Use the example network from class and compute the updated weight values for the first gradient-descent iteration on the XOR example, i.e., the input [1, 1] with target output 0. Use the same initial weights as in the class example, but assume the identity function as the activation function (f(x) = x). A NumPy sketch for checking your arithmetic appears after this list.

  3. Build a Keras-based ConvNet for Keras’s Fashion MNIST dataset (fashion_mnist). Experiment with different network architectures, submit your best-performing network, and report the results. A starter sketch appears after this list.
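
If you want to check your hand computation for item 2, here is a minimal NumPy sketch of one gradient-descent step on a 2-2-1 network with identity activations and squared error. The initial weights, the learning rate, and the omission of bias terms are placeholder assumptions; substitute the actual values from the class example.

```python
import numpy as np

# Placeholder values -- substitute the initial weights and learning rate
# from the class example. Bias terms are omitted here for brevity.
W1 = np.array([[0.5, 0.5],   # hidden weights: 2 inputs -> 2 hidden units
               [0.5, 0.5]])
W2 = np.array([0.5, 0.5])    # output weights: 2 hidden units -> 1 output
eta = 0.1                    # learning rate (assumed)

x = np.array([1.0, 1.0])     # XOR input [1, 1]
t = 0.0                      # target output, since XOR(1, 1) = 0

# Forward pass. With f(x) = x, f'(x) = 1, so no activation-derivative
# factors appear in the backward pass below.
h = W1 @ x                   # hidden-layer outputs
y = W2 @ h                   # network output

# Backward pass for squared error E = (1/2) * (y - t)^2.
delta_out = y - t                  # dE/dy * f'(net) = (y - t) * 1
grad_W2 = delta_out * h            # dE/dW2
delta_h = delta_out * W2           # error propagated to each hidden unit
grad_W1 = np.outer(delta_h, x)     # dE/dW1

# One gradient-descent update.
W2 -= eta * grad_W2
W1 -= eta * grad_W1
print("updated W1:\n", W1, "\nupdated W2:\n", W2)
```

Your by-hand derivatives should match these gradients term for term; if they don’t, the chain-rule step from delta_out to delta_h is the usual culprit.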
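For item 3, a minimal starting point might look like the sketch below. The layer sizes, dropout rate, and epoch count are illustrative assumptions to iterate from, not a required design; only the dataset (fashion_mnist) comes from the assignment.

```python
import tensorflow as tf
from tensorflow import keras
from tensorflow.keras import layers

# Load the data, scale pixels to [0, 1], and add a channel axis for Conv2D.
(x_train, y_train), (x_test, y_test) = keras.datasets.fashion_mnist.load_data()
x_train = x_train[..., None].astype("float32") / 255.0
x_test = x_test[..., None].astype("float32") / 255.0

model = keras.Sequential([
    keras.Input(shape=(28, 28, 1)),
    layers.Conv2D(32, 3, activation="relu"),
    layers.MaxPooling2D(),
    layers.Conv2D(64, 3, activation="relu"),
    layers.MaxPooling2D(),
    layers.Flatten(),
    layers.Dense(128, activation="relu"),
    layers.Dropout(0.5),                     # assumed rate; tune it
    layers.Dense(10, activation="softmax"),  # 10 clothing classes
])
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
model.fit(x_train, y_train, epochs=5, validation_split=0.1)

test_loss, test_acc = model.evaluate(x_test, y_test)
print(f"test accuracy: {test_acc:.4f}")
```

From here, experiment with depth, filter counts, regularization, and training schedule, and report which changes improved test accuracy.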

Checking in

Submit a Jupyter notebook (homework4.ipynb). We will grade your work according to the following criteria:

See the policies page for homework due dates and times.