Schedule - CS375

See also: CS 376 Schedule

Any content in the future should be considered tentative and subject to change.

Week 1: Introduction

Getting started with ML: impacts of AI, running Python in notebooks, training an image classifier using off-the-shelf code.
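
As a first taste, here is a minimal sketch of training a classifier with off-the-shelf code (this uses scikit-learn's bundled digits dataset for illustration; the actual lab may use different data and libraries):

```python
# Minimal "off-the-shelf" image classifier sketch (illustrative only).
from sklearn.datasets import load_digits
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

digits = load_digits()  # 8x8 grayscale digit images, flattened to 64 features
X_train, X_test, y_train, y_test = train_test_split(
    digits.data, digits.target, test_size=0.25, random_state=0)

clf = LogisticRegression(max_iter=1000)  # off-the-shelf linear classifier
clf.fit(X_train, y_train)                # the "optimization game"
print("test accuracy:", clf.score(X_test, y_test))
```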

Key Questions
  • What is the essence of modern approaches to AI?
  • What optimization games are AI systems playing?
  • Can AI systems be smarter than humans?
Objectives
  • Describe the goals of artificial intelligence and machine learning
  • Describe how learning-based AI learns from data, in contrast with rule-based (symbolic) AI
  • [OG-ProblemFraming-Paradigms]: Contrast supervised learning, self-supervised learning, and reinforcement learning
  • Write and execute basic Python code using Jupyter Notebooks

Wednesday

  • Welcome discussion: hopes and concerns
  • Course logistics
    • Assessments: skills, effort, and community
    • Weekly journals, quizzes every other Friday
    • Perusall
  • Slides: Welcome to CS 375
    • My story and stance:
      • how God brought me to learn about ML/AI
      • how it’s a gift that will surely have a place in the new creation, even though we abuse it now
    • We need to work to discern AI together.
      • Importance
        • Divisiveness
        • Economic impacts
        • Existential angst
        • Identity, desires, and relationships
        • You need to be able to discern it at a fundamental level, not just from its external behavior
      • This class:
        • This class is about how AI works at a fundamental level, and how that understanding helps us see where it fits into God’s story
    • Tweakable Machines playing Optimization Games
      • board games
      • hook-the-human games
      • predict protein folding, guess the weather, design a molecule, …
      • imitation games: mimicking decisions, conversations, images, …
      • exploration games: control a robot, …
    • Problem framing
      • programmed vs learned
      • supervised learning: mimicry
      • self-supervised learning: reducing surprise
      • reinforcement learning: learning by trial and error

Week 2: Array Programming & Regression

Introduction to numerical computing with NumPy/PyTorch: element-wise operations, reductions, dot products, MSE. First taste of sklearn regression.
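
For orientation, here is a small sketch of these operations in PyTorch (toy tensors, for illustration only):

```python
import torch

x = torch.tensor([1.0, 2.0, 3.0])
w = torch.tensor([0.5, -1.0, 2.0])

elementwise = x * w        # element-wise multiply: same shape as x and w
total = elementwise.sum()  # reduction: collapse to a single number
dot = torch.dot(x, w)      # dot product = element-wise multiply, then sum
assert torch.isclose(total, dot)

y_true = torch.tensor([1.0, 0.0, 2.0])
y_pred = torch.tensor([0.8, 0.1, 2.5])
mse = ((y_pred - y_true) ** 2).mean()  # mean squared error
print(dot.item(), mse.item())
```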

Key Questions
  • How do we represent data as arrays/tensors?
  • What is a dot product and how is it used in ML?
  • What does it mean to “fit” a model?
Objectives

This week we’ll make progress towards the following objectives:

  • [TM-TensorOps]: Implement basic array-computing operations (element-wise operations, reductions, dot products)
  • [OG-LossFunctions]: Compute MSE loss
  • [OG-ProblemFraming-Paradigms]: Contrast different types of learning machines (supervised learning, self-supervised learning, reinforcement learning)
  • If you didn’t take DATA 202: use the sklearn API for basic regression tasks
Resources

You may find these interactive articles helpful (by Amazon’s Machine Learning team):

Monday

  • Assumptions of AI: Experience (“IID” amnesia vs continual life; our mistakes matter but Jesus gives us grace)
  • Handout: Lab 1 review, intro to dot product
  • Lab 1 review
  • Intro to dot product

Wednesday

Friday

Week 3: Linear Models for Regression and Classification (and LLM APIs)

Linear regression and classification from the ground up. Introduction to classification models and metrics. If time: Using LLM APIs to build AI-powered applications.
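
For reference, a minimal sketch of the sklearn API for regression and classification, with MSE and accuracy as metrics (toy data made up for illustration):

```python
import numpy as np
from sklearn.linear_model import LinearRegression, LogisticRegression
from sklearn.metrics import accuracy_score, mean_squared_error

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 1))                                  # one input feature
y_reg = 3.0 * X[:, 0] + 1.0 + rng.normal(scale=0.1, size=100)  # regression target
y_cls = (X[:, 0] > 0).astype(int)                              # classification target

reg = LinearRegression().fit(X, y_reg)
print("MSE:", mean_squared_error(y_reg, reg.predict(X)))
print("weight, bias:", reg.coef_, reg.intercept_)

clf = LogisticRegression().fit(X, y_cls)
print("accuracy:", accuracy_score(y_cls, clf.predict(X)))
```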

Key Questions
  • How is linear regression an optimization game played by a tweakable machine?
  • How do we call an LLM API? (see the sketch after this list)
  • How do we evaluate a classification model?
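
A minimal sketch of calling an LLM API (this assumes the OpenAI Python SDK and an OPENAI_API_KEY in the environment; the course may use a different provider, and the model name below is only a placeholder):

```python
# Hypothetical LLM API call sketch; provider, SDK, and model name are assumptions.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment
response = client.chat.completions.create(
    model="gpt-4o-mini",  # placeholder model name
    messages=[{"role": "user", "content": "Explain a dot product in one sentence."}],
)
print(response.choices[0].message.content)
```
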
Objectives

Monday

  • Handout: PyTorch, dot products, regression metrics
  • Assumptions of AI: What’s the objective?
    • ML: optimize a single number at huge scale
    • Reality:
      • " The thief comes only to steal and kill and destroy; I have come that they may have life, and have it to the full." (John 10:10)
      • the objective is life
        • Many wise paths
        • passing on good to children (unbounded richness)
  • Logistics:
    • Homework 1
    • Journals
    • Quiz opportunity on Wednesday
  • Slides: CS 375 Week 3
  • Lab recap: PyTorch (and sklearn notebooks)

Wednesday

Friday

Week 4: Multi-input Models & Softmax

Extending linear models to multiple inputs. Understanding softmax and cross-entropy loss.
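
A small sketch of softmax and cross-entropy in PyTorch (toy logits, for illustration):

```python
import torch
import torch.nn.functional as F

logits = torch.tensor([[2.0, 0.5, -1.0]])  # one example, three class scores
probs = torch.softmax(logits, dim=-1)      # exponentiate, then normalize
print(probs, probs.sum())                  # non-negative and sums to 1: a valid distribution

target = torch.tensor([0])                    # index of the true class
manual_ce = -torch.log(probs[0, target[0]])   # cross-entropy: -log(prob of true class)
library_ce = F.cross_entropy(logits, target)  # softmax + negative log-likelihood in one call
print(manual_ce.item(), library_ce.item())    # the two values should match
```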

Key Questions
  • How does linear regression extend to multiple input features?
  • What is softmax and why do we use it for classification?
  • What is cross-entropy loss?
Objectives
  • [TM-TensorOps]: Work with multi-dimensional tensors, predict shapes of matrix operations
  • [TM-DataFlow]: Trace data shapes through a multi-input linear model
  • [TM-Softmax]: Implement softmax and explain why it produces a valid probability distribution
  • [OG-LossFunctions]: Describe and compute cross-entropy loss

Monday

Wednesday

Friday

Week 5: Features & MLP Architecture

Understanding feature extraction with ReLU. Introduction to classifier heads and bodies. The multi-layer perceptron (MLP) architecture.
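
A minimal MLP sketch in PyTorch showing the body/head split (layer sizes are made up for illustration):

```python
import torch
from torch import nn

# "Body" extracts features with ReLU; "head" maps features to class scores.
body = nn.Sequential(
    nn.Linear(64, 32),  # 64 input features -> 32 hidden features
    nn.ReLU(),          # keep positive evidence, zero out the rest
)
head = nn.Linear(32, 10)  # 32 features -> 10 class scores (logits)
mlp = nn.Sequential(body, head)

x = torch.randn(8, 64)  # batch of 8 examples
print(mlp(x).shape)     # torch.Size([8, 10])
```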

Key Questions
  • Why are good features important for neural networks?
  • What is a classifier “head” vs “body”?
  • How does ReLU create useful features?
Objectives

Monday

  • Feature extractors intro
  • ReLU features intro
  • Classifier head and body intro

Wednesday

Friday

  • Preview of learning by gradient descent
  • Review day: gradient game
  • Tech presentation

Week 6: Gradient Descent & Generalization

Learning by gradient descent. Understanding why generalization matters and how to measure/improve it.
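
A minimal gradient-descent sketch in PyTorch, fitting a line to made-up noisy data:

```python
import torch

x = torch.linspace(-1, 1, 50)
y = 3.0 * x + 1.0 + 0.1 * torch.randn(50)  # hypothetical noisy line

w = torch.zeros(1, requires_grad=True)
b = torch.zeros(1, requires_grad=True)
lr = 0.1  # learning rate

for step in range(200):
    loss = ((w * x + b - y) ** 2).mean()  # MSE loss
    loss.backward()                       # compute gradients
    with torch.no_grad():
        w -= lr * w.grad                  # step downhill
        b -= lr * b.grad
        w.grad.zero_()
        b.grad.zero_()

print(w.item(), b.item())  # should approach w ≈ 3, b ≈ 1
```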

Key Questions
  • How does gradient descent work?
  • What is overfitting vs underfitting?
  • How can data augmentation help generalization?
Objectives

Monday

Wednesday

Friday

  • Will It Generalize? slides
  • Data augmentation notebook (see the sketch after this list)
  • Tech presentation
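
A small data-augmentation sketch (assuming torchvision; the notebook may use different transforms):

```python
from torchvision import transforms

# Common augmentations: each pass through the data sees a slightly
# different version of every image, which helps generalization.
augment = transforms.Compose([
    transforms.RandomHorizontalFlip(),
    transforms.RandomRotation(10),           # small random rotations (degrees)
    transforms.ColorJitter(brightness=0.2),
    transforms.ToTensor(),
])
# Applying `augment` to a PIL image returns a slightly different tensor each call.
```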

Week 7: Embeddings & RL

Embeddings as the data structures of neural computation. Introduction to reinforcement learning.
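
A small embedding sketch in PyTorch (vocabulary size and dimension are made up for illustration):

```python
import torch
from torch import nn
import torch.nn.functional as F

# An embedding table maps integer IDs (words, items, ...) to learned vectors.
embed = nn.Embedding(num_embeddings=1000, embedding_dim=16)

ids = torch.tensor([3, 7])
vec_a, vec_b = embed(ids)                       # two 16-dimensional vectors
sim = F.cosine_similarity(vec_a, vec_b, dim=0)  # similarity between the vectors
print(vec_a.shape, sim.item())
```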

Key Questions
  • What are embeddings and how are they used in ML?
  • How does reinforcement learning differ from supervised learning?
  • What is the difference between learning to mimic vs learning by exploring?
Objectives

Monday

Wednesday

Friday

  • Slides: CS 375: Wrap-Up
  • Learning to Mimic vs Learning by Exploring
  • Course wrap-up