Announcements 23SP

Projects

Text and Image Generation – short and very clear YouTube videos

Reading Recess Reminders

Midterm

Project

Final Week

Congratulations on making it to the end of the semester!

Timeline of the rest of the semester

Notes

Final Project Presentations

See you on Monday in lab!

Week 11

Sorry I didn’t get a logistics email out last week. Between traveling, advising, and our midterm, I’ve been swamped. I recognize you all may have been swamped too, so if you need leniency on any due dates, please let me know.

This week:

I’m sorry for the continued delay in getting project proposal feedback back to you, and the gradebook is still not accurate. I hope to catch up this week!

Since we’re a bit lighter this week, take the opportunity to catch up on anything you’ve missed – or revise a prior assignment for a better grade. Let me know if you choose to do that.

See you in lab on Monday!

Week 9

Welcome to week 9! This week we start to take the covers off NLP models, just as we took the covers off image models in the first half of the class. In particular, we’ll get our first taste of the Transformer model, the most important model in machine learning today. For full disclosure, we won’t get to a lot of new content this week because I’m traveling for part of the week, but it’ll be a good chance to catch up and review.

This week:

Reminders:

I’ll be out of town on Friday. I might be able to hold a remote session, though, so don’t cancel your plans quite yet.

Week 8

Welcome back from Spring Break! We’re starting the second half of the class(*), switching from the basics of deep learning to one of its most transformational applications: language models. Yup, this is where we learn about ChatGPT and its cousins. We’ll start with a high-level view of how language modeling works, then dig into the strange yet strangely simple Transformer family of neural architectures. We’ll also discuss image generation, human-AI interaction, transparency, and a few other topics in future weeks.

Logistics

Notes

See you Monday!

(*) It’s actually more than halfway through the semester, thanks to days off for Advising and Easter.

Week 7 mid-week

See you Friday!

Week 7

Welcome to unit 7, the week before Spring Break!

In this unit we introduce one of the most powerful concepts in machine learning: the embedding. It’s the idea that instead of being explicitly given a representation for something, we learn it from data based on the properties it should have. We’ve actually already seen embeddings of various kinds (the inputs to the last layer of an image classifier for one), but we’ll look at two examples where the embedding aspect is even clearer: movies and words.
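To make the movie example concrete, here is a minimal NumPy sketch (not from the course notebooks; the ratings and sizes are made up) that learns 2-dimensional movie and user embeddings purely from a small ratings table, by gradient descent on the reconstruction error. Nobody tells the model what the dimensions mean; they emerge from the properties the embeddings should have, namely that a user’s rating is roughly the dot product of the two vectors.

```python
import numpy as np

# Toy ratings table: 4 users x 3 movies (every entry observed, for simplicity).
# Movies 0 and 1 are liked by the same users; movie 2 by the other users.
ratings = np.array([[5., 4., 1.],
                    [4., 5., 1.],
                    [1., 1., 5.],
                    [1., 2., 4.]])

rng = np.random.default_rng(0)
k = 2  # embedding dimension
user_emb = rng.normal(scale=0.1, size=(4, k))
movie_emb = rng.normal(scale=0.1, size=(3, k))

lr = 0.2
for _ in range(5000):
    pred = user_emb @ movie_emb.T              # predicted ratings
    err = pred - ratings                       # residuals
    # Gradient of (half the) mean squared error w.r.t. each embedding table
    user_emb -= lr * (err @ movie_emb) / ratings.size
    movie_emb -= lr * (err.T @ user_emb) / ratings.size

final_loss = np.mean((user_emb @ movie_emb.T - ratings) ** 2)
```

After training, the two similarly-rated movies end up with similar embedding vectors, even though we never told the model anything about the movies themselves.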

News:

  1. As usual, during/after the Lab I made some clarifications and improvements to the instructions. One that I’m particularly happy about this year is that I’ve added code to the NN Regression notebook that shows what features the model learned. Have a look at the preview of that notebook.
  2. I simplified Homework 3 a bit, to make “doing it the wrong way” an optional extension. I’ve also clarified the strategy suggestions.
  3. After Spring Break we’ll start on Transformers, which means we’ll be using a new textbook. Unfortunately this one doesn’t have a free online version. If cost is an issue, talk with me.
  4. I recorded a video walking through backpropagation last year that you might find helpful. I had it buried in a private Teams folder, but I just put it up as a playlist on YouTube.
  5. In lieu of a normal Discussion this week, we get to start thinking about project ideas! I’ve put up a Forum on Moodle for this week if you’d like to pitch an idea and are looking for partners. We’ll talk more about projects in class this week; here’s the overall description.

Logistics:

  1. Prep 7 is due on Monday as usual.
  2. Also due Monday: contribute the first of your Exam Questions; see the Instructions.
  3. Homework 3 is due on Friday.
  4. No discussion this week either.
  5. Lab 6 is due on Friday as usual.
  6. If you’ve gotten behind in completing assignments, don’t wait until Spring Break to start catching up. Remember our late policy: you can get full Outcome credit for late assignments, but you lose a Process point for each business day late. So getting behind will sting, but catching up is always worth it.
  7. Back to normal schedule: Monday and Wednesday in classroom, Friday in lab.

See you Monday in class!

Week 6: Recap and Regularization

Happy Saturday, Neural Network coders!

The next unit discusses state-of-the-art models (still focusing on computer vision). We’ll introduce or revisit tools that allow our models to achieve high performance, such as data augmentation and regularization. But we’ll actually spend most of our time together getting more practice with how neural networks work from the ground up as we implement our own simple neural net image classifier from scratch (definitely something to mention in an interview!).
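As a preview of what “from scratch” means here, the following is a minimal NumPy sketch (layer sizes and numbers are made up, and it shows only the forward pass plus an L2 penalty, not the training loop) of a one-hidden-layer classifier of the kind we’ll build:

```python
import numpy as np

rng = np.random.default_rng(0)

# One-hidden-layer classifier written out by hand.
n_in, n_hidden, n_classes = 4, 8, 3
W1 = rng.normal(scale=0.1, size=(n_in, n_hidden))
b1 = np.zeros(n_hidden)
W2 = rng.normal(scale=0.1, size=(n_hidden, n_classes))
b2 = np.zeros(n_classes)

def forward(x):
    h = np.maximum(0.0, x @ W1 + b1)               # ReLU hidden layer
    logits = h @ W2 + b2
    logits -= logits.max(axis=-1, keepdims=True)   # for numerical stability
    exp = np.exp(logits)
    return exp / exp.sum(axis=-1, keepdims=True)   # softmax class probabilities

x = rng.normal(size=(5, n_in))                     # a batch of 5 fake inputs
probs = forward(x)

# L2 regularization: add a penalty on large weights to the training loss.
reg_penalty = 1e-3 * (np.sum(W1**2) + np.sum(W2**2))
```

The regularization term is one of the tools mentioned above: it nudges the weights toward zero during training, which tends to reduce overfitting.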

Logistics:

I’ve started working on making sure that the Moodle Calendar is up to date. I’ll try to keep it that way, but if you notice something missing, please let me know.

Week 5: Learning

Last week we studied the very basics of learning by gradient descent. We implemented one of the simplest possible models, linear regression, and saw how to fit it using gradient descent. This week we’ll see how to fit more complex models: we’ll change the loss function so that it can perform classification instead of regression, and we’ll add a nonlinearity to the model so that it can fit more complex functions. We’re well on our way to understanding how to build a neural network!
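To illustrate the “change the loss function” step, here is a minimal sketch (the data is made up) of logistic regression fit by the same gradient descent loop we used for linear regression, just with a sigmoid nonlinearity and a cross-entropy loss in place of squared error:

```python
import numpy as np

# Tiny 1-D binary classification problem: points left of 0 are class 0,
# points right of 0 are class 1.
x = np.array([-2.0, -1.0, -0.5, 0.5, 1.0, 2.0])
y = np.array([0., 0., 0., 1., 1., 1.])

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

w, b = 0.0, 0.0
lr = 0.5
for _ in range(1000):
    p = sigmoid(w * x + b)            # predicted probability of class 1
    # Gradient of the mean cross-entropy loss. Note it has the same
    # (prediction - target) * input shape as the linear-regression gradient.
    w -= lr * np.mean((p - y) * x)
    b -= lr * np.mean(p - y)

pred = (sigmoid(w * x + b) > 0.5).astype(float)
```

The pleasant surprise is that swapping squared error for cross-entropy leaves the gradient update looking almost identical; only the prediction function changed.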

Logistics:

Notes:

If last week’s material was unclear for you, I recommend going over the Wednesday activity again; I’ve made some additions to it to help clarify some of the concepts. I also made some improvements to the Fundamentals notebooks to help explain things better, so if you’re still having trouble with those, you might want to download the new versions. (The old version is just fine too.)

I’m working on clarifying the Moodle calendar and grading. You probably got a notification about grades for Labs 1 and 2. If you didn’t realize that labs have the Moodle check-in quizzes, I’ll allow grace for completing those through this week. The course grades in Moodle are meaningless right now, but I’ll be updating them soon.

See you on Monday! As usual, we’ll be in the classroom, reviewing last week’s material and starting on this week’s. Come with questions about the reading or video-watching!

Week 4: Models

This week we’re pulling off the covers to see how these machine learning models actually work. Hang on for the ride, and keep your favorite thing-explainer (ChatGPT, YouTube, etc.) close at hand.

I forgot to mention something about loss curves on Friday, when we saw that training loss was higher than validation loss: training loss gets computed during training, and Dropout is active during training. We’ll talk more about that in a couple of weeks, but for now, the intuition is that the training process intentionally handicaps the model to avoid overfitting.
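A minimal sketch of that handicap (plain NumPy, not how any particular library implements it internally): during training, dropout randomly zeroes activations and rescales the survivors so the expected value is unchanged; at evaluation time it does nothing, so the model gets to use all of its units.

```python
import numpy as np

rng = np.random.default_rng(0)
x = np.ones(10_000)   # stand-in for a layer's activations
p = 0.5               # dropout probability

# Training mode: randomly zero activations, scale survivors by 1/(1-p)
# ("inverted dropout") so the expected activation is unchanged.
mask = rng.random(x.shape) >= p
train_out = x * mask / (1 - p)

# Evaluation mode: dropout is a no-op; the full set of activations is used.
eval_out = x
```

The averages roughly match, but any single training-mode forward pass is noisier and uses only part of the network, which is one reason the training loss can sit above the validation loss.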

See you Monday! Remember to come with questions about the reading or video-watching.

Week 2: Data

I’m excited to have you all in class. The semester start was a bit bumpy for various reasons, but I’m hoping we can settle into a rhythm soon.

Wrapping up Week 1

Make sure you’ve done all of the following (check them off in Moodle when you have):

I’ve set suggested due dates for everything in Moodle; we can continue to discuss the best policy here together. One issue is that since Moodle quizzes lock when they’re closed, I’m using the “Expect Completion By” dates for those activities. Those should show up in your Moodle Calendar, but might not show up on the activity itself. I’m looking into this.

Preparing for Week 2

In week 2 we will start considering what data you need to train models. You’ll collect some image data yourself and train and evaluate a model. We’ll also start looking at some of the ethics of collecting data for AI; we will return to this discussion throughout the semester.

I’ll post the following shortly; they won’t be due till the following week:

Locations will be the same as Week 1: Monday and Wednesday are in the classroom (NH 253), Friday in the Gold lab.

I’m here for you. We’re all here for each other. May our Lord give us all strength to persevere, joy in the journey, and love for each other along the way.

Exams by Us