Announcements 23SP

Warning: This content has not yet been fully revised for this year.

Projects

Text and Image Generation – short and very clear YouTube videos

Reading Recess Reminders

Midterm

Project

Final Week

Congratulations on making it to the end of the semester!

Timeline of the rest of the semester

Notes

Final Project Presentations

See you on Monday in lab!

Week 11

Sorry I didn’t get a logistics email out last week. Between traveling, advising, and our midterm, I’ve been swamped. I recognize you all may have been swamped too, so if you need leniency on any due dates, please let me know.

This week:

I’m sorry for the continued delay in getting project proposal feedback back to you, and the gradebook is still not accurate. I hope to catch up this week!

Since we’re a bit lighter this week, take the opportunity to catch up on anything you’ve missed – or revise a prior assignment for a better grade. Let me know if you choose to do that.

See you in lab on Monday!

Week 9

This week:

Reminders:

I’ll be out of town on Friday. I might be able to hold a remote session, though, so don’t cancel your plans quite yet.

Week 8

Welcome back from Spring Break! We’re starting the second half of the class(*), switching from the basics of deep learning to one of its most transformational applications: language models. Yup, this is where we learn about ChatGPT and its cousins. We’ll start with a high-level view of how language modeling works, then dig into the strange yet strangely simple Transformer family of neural architectures. We’ll also discuss image generation, human-AI interaction, transparency, and a few other topics in future weeks.
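
If you want a concrete preview: a language model just estimates the probability of the next word given the words so far. Here's a toy sketch of that idea (the corpus is made up; real models replace the count table with a Transformer):

```python
from collections import Counter, defaultdict

# Tiniest possible language model: bigram counts over a made-up corpus.
corpus = "the cat sat on the mat . the dog sat on the rug .".split()

counts = defaultdict(Counter)
for prev, word in zip(corpus, corpus[1:]):
    counts[prev][word] += 1

def p_next(prev, word):
    """Estimate P(word | prev) from the counts."""
    total = sum(counts[prev].values())
    return counts[prev][word] / total

print(p_next("the", "cat"))  # 0.25: "the" is followed by cat/mat/dog/rug equally often
```

GPT-style models do exactly this prediction task, just with billions of learned parameters instead of a count table.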

Logistics

Notes

See you Monday!

(*) It’s actually more than halfway through the semester, thanks to days off for Advising and Easter.

Week 7 mid-week

See you Friday!

Week 7 old (embeddings)

In this unit we introduce one of the most powerful concepts in machine learning: the embedding. It’s the idea that instead of being explicitly given a representation for something, we learn it from data based on the properties it should have. We’ve actually already seen embeddings of various kinds (the inputs to the last layer of an image classifier, for one), but we’ll look at two examples where the embedding aspect is even clearer: movies and words.
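
To make that concrete, here's a minimal sketch (not the lab's code; the sizes, the name movie_emb, and the ratings are all made up) of learning movie embeddings in Keras. No movie vector is handed a meaning; each one's only job is to make dot products predict ratings:

```python
import numpy as np
from tensorflow import keras

# Made-up sizes and synthetic ratings, purely for illustration.
n_users, n_movies, dim = 100, 50, 8
rng = np.random.default_rng(0)
users = rng.integers(0, n_users, size=1000).reshape(-1, 1)
movies = rng.integers(0, n_movies, size=1000).reshape(-1, 1)
ratings = rng.uniform(1.0, 5.0, size=1000)

user_in = keras.Input(shape=(1,))
movie_in = keras.Input(shape=(1,))
# An Embedding layer is just a learned lookup table: id -> vector.
u = keras.layers.Flatten()(keras.layers.Embedding(n_users, dim)(user_in))
m = keras.layers.Flatten()(
    keras.layers.Embedding(n_movies, dim, name="movie_emb")(movie_in))
pred = keras.layers.Dot(axes=1)([u, m])  # predicted rating = dot product

model = keras.Model([user_in, movie_in], pred)
model.compile(optimizer="adam", loss="mse")
model.fit([users, movies], ratings, epochs=3, verbose=0)

# The learned vectors: movies rated similarly end up with similar rows.
movie_vectors = model.get_layer("movie_emb").get_weights()[0]  # shape (50, 8)
```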

News:

  1. In lieu of a normal Discussion this week, we get to start thinking about project ideas! I’ve put up a Forum on Moodle for this week if you’d like to pitch an idea and are looking for partners. We’ll talk more about projects in class this week; here’s the overall description.

Logistics:

  1. Prep 7 is due on Monday as usual.
  2. Also due Monday: the first of your Exam Questions; see the Instructions.
  3. Homework 3 is due on Friday.
  4. No discussion this week either.
  5. Lab 6 is due on Friday as usual.

See you Monday in class!

Week 7: Vision and Perspectives

This will be the last week of CS 375! Many, but not all, of you will be continuing with us after Spring Break in CS 376. I hope the course has been valuable to you so far, and that you’re excited to continue learning about machine learning.

We’ll discuss two final topics briefly this week: (1) convolutional neural networks, and (2) some broader contexts of machine learning, historically and in today’s society.

Logistics:

My wife has been sick with the flu, so I’ve been behind on things in general as I take care of her and our kids. I’m hopeful that I won’t get it myself, but if I do, you might have a guest lecture.

Hopefully I’ll see you Monday!

Week 6: Generalization and Tasks

Happy Saturday, Neural Network coders!

The last few labs are something you can be proud of: in Lab 3 you trained a simple neural network entirely by hand; in Lab 4 you trained and evaluated a deep neural network in Keras and understood exactly why each part of the network was needed (linear layers, nonlinearities, softmax, cross-entropy loss, etc.); and in Lab 5 you upgraded the input to use images and understood exactly what it meant for a convolutional neural network to be a “feature extractor”.
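
For reference, that whole Lab 4/5-style recipe fits in a few lines of Keras. This is a sketch with assumed shapes (28×28 inputs, 10 classes), not the actual lab code:

```python
from tensorflow import keras

model = keras.Sequential([
    keras.Input(shape=(28, 28)),
    keras.layers.Flatten(),                        # image -> flat vector
    keras.layers.Dense(128, activation="relu"),    # linear layer + nonlinearity
    keras.layers.Dense(10, activation="softmax"),  # scores -> class probabilities
])
model.compile(optimizer="sgd",
              loss="sparse_categorical_crossentropy",  # the cross-entropy loss
              metrics=["accuracy"])
```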

This week we’ll discuss how to make sure our models can generalize to tasks that they weren’t explicitly trained on. Speaking of tasks, we’ll also spend some time thinking about how to set up neural networks to handle various kinds of tasks.

Logistics:

Want more resources to help understand what’s going on?

Week 5: Learning

Last week we studied the very basics of learning by gradient descent. We implemented one of the simplest possible models, linear regression, and saw how to fit it using gradient descent. This week we’ll see how to fit more complex models: we’ll change the loss function so that it can perform classification instead of regression, and we’ll add a nonlinearity to the model so that it can fit more complex functions. We’re well on our way to understanding how to build a neural network!
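
Here's a preview of where we're headed, as a NumPy sketch (toy data and a made-up learning rate): take last week's gradient-descent loop, swap in a sigmoid nonlinearity and the cross-entropy loss, and you get a classifier:

```python
import numpy as np

# Toy 2-feature data with 0/1 labels, made up for illustration.
rng = np.random.default_rng(0)
x = rng.normal(size=(100, 2))
y = (x[:, 0] + x[:, 1] > 0).astype(float)

w, b, lr = np.zeros(2), 0.0, 0.1
for step in range(500):
    p = 1 / (1 + np.exp(-(x @ w + b)))   # sigmoid: scores -> probabilities
    grad_w = x.T @ (p - y) / len(y)      # gradient of the mean cross-entropy loss
    grad_b = (p - y).mean()
    w, b = w - lr * grad_w, b - lr * grad_b

print(w, b)  # weights point along x0 + x1, matching how the labels were made
```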

Logistics:

Notes:

Useful resources for practice:

Week 4: Models

Lab 3 was a rite of passage for an ML practitioner: you trained a model by gradient descent entirely by hand, with no libraries beyond NumPy. It’s ok if you don’t quite understand everything you did the first time; study it again a few times until it clicks. If you haven’t yet tried the toy2 dataset (with an outlier) to see the difference that MAE vs. MSE makes, I highly encourage you to do that and think about why the results are (or should be) different.
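
If you want the two-line intuition for the outlier experiment: the constant prediction that minimizes MSE is the mean, while the one that minimizes MAE is the median. The data below is made up, standing in for toy2:

```python
import numpy as np

# A cluster near 1 plus one outlier.
y = np.array([1.0, 1.1, 0.9, 1.0, 10.0])

print("MSE-optimal constant:", y.mean())      # 2.8 -- dragged toward the outlier
print("MAE-optimal constant:", np.median(y))  # 1.0 -- robust to it
```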

But that was just linear regression. What about deep neural networks? This week we bridge the gap: linear (“dense”) layers with multiple inputs and outputs, nonlinear layers, classification, classification losses, … and how a library like Keras lets us stay sane (e.g., not have to do backprop by hand). Hang on for the ride, and keep your favorite thing-explainer (YouTube, ChatGPT, friends, etc.) close at hand.
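
Before Keras hides it from us, it's worth seeing that a “dense” layer is nothing but a matrix multiply plus a bias. A shape-only NumPy sketch (all sizes arbitrary):

```python
import numpy as np

# A batch of 4 examples with 3 features each, mapped to 5 outputs.
x = np.random.randn(4, 3)  # inputs: one row per example
W = np.random.randn(3, 5)  # weights: 3 inputs -> 5 outputs
b = np.zeros(5)            # one bias per output
y = x @ W + b              # the whole layer is one line
print(y.shape)             # (4, 5): 5 outputs for each of the 4 examples
```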

Logistics:

See you on Monday!

Week 2: Supervised Learning

Foremost: can anyone in the 9:15am section switch to the 11am section? The earlier section is overfull; the later one has lots of space.

This week we dive into supervised machine learning (ML): how it fits into the landscape of AI, how to set up an ML task, some simple (tree-based) methods you can use for it, and some common pitfalls. On the coding side, we’ll learn about the industry-standard scikit-learn API for machine learning and the NumPy API for computing with arrays.
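
The scikit-learn pattern we'll use all semester is construct, fit, predict. A minimal sketch with made-up data:

```python
import numpy as np
from sklearn.tree import DecisionTreeClassifier

X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]])  # features: one row per example
y = np.array([0, 1, 1, 0])                      # one label per row

clf = DecisionTreeClassifier()
clf.fit(X, y)                    # every estimator learns via fit(features, labels)
print(clf.predict([[0, 1]]))     # ...then predicts on new rows -> [1]
```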

In class last week we mentioned the Dan Tepfer talk in the January Series, “Algorithms as the Shapers of Music”, which is still available to watch but only through February 14.

To wrap up week 1, make sure you’ve done all of the following (all on Moodle):

Even though you might discover that you can find course materials in a few different places, always start in Moodle.

Once we get into the rhythm of the semester, Preparation activities will be due on Mondays. So Preparation 2 would be due on Monday the 22nd. But due to delays in me getting things posted and some students not even officially being in the class yet, I won’t count anything as late until Monday the 29th.

Locations will be the same as Week 1: Monday and Wednesday are in the classroom (SB 382), Friday in the Gold lab (SB 354).

I’m here for you. We’re all here for each other. May our Lord give us all strength to persevere, joy in the journey, and love for each other along the way. I’m excited to have you all in class, and I hope you’re having a restful weekend.

Forum Posts 22SP