
Fall 24 Midterm Review
Shen Shen
October 18, 2024
Intro to Machine Learning

Outline
- Rundown
- Past Exams
- Q&A
Week 1 - IntroML
- Terminology
- Training, validation, testing
- Identifying overfitting and underfitting
- Concrete processes
- Learning algorithm
- Validation and cross-validation (see the sketch after this list)
- Concept of hyperparameter
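A minimal, self-contained sketch of k-fold cross-validation for choosing a hyperparameter (here, a polynomial degree on made-up 1-d data); the data, candidate degrees, and fold count are illustrative assumptions, not course code:

import numpy as np

# k-fold cross-validation: score each candidate degree (the hyperparameter)
# by its average validation error across k held-out folds.
rng = np.random.default_rng(0)
x = rng.uniform(-1, 1, size=60)                      # toy 1-d inputs
y = np.sin(3 * x) + 0.1 * rng.normal(size=60)        # noisy targets

k = 5
folds = np.array_split(rng.permutation(len(x)), k)   # disjoint validation folds

def cv_error(degree):
    errs = []
    for i in range(k):
        val = folds[i]                                               # held-out fold
        train = np.concatenate([folds[j] for j in range(k) if j != i])
        coeffs = np.polyfit(x[train], y[train], degree)              # train on the rest
        errs.append(np.mean((np.polyval(coeffs, x[val]) - y[val]) ** 2))
    return np.mean(errs)                                             # average validation MSE

for degree in [1, 3, 5, 9]:
    print(degree, cv_error(degree))   # too low tends to underfit, too high to overfit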
Week 2 - Regression
- Problem Setup
- Analytical solution formula $\theta^* = (\tilde{X}^\top \tilde{X})^{-1} \tilde{X}^\top \tilde{Y}$ (and what $\tilde{X}$ is; see the sketch after this list)
- When $\tilde{X}^\top \tilde{X}$ is not invertible (optimal solutions still exist; just not via the "formula")
- Practically (two scenarios)
- Visually (the objective function is no longer bowl-shaped; instead it has a "half-pipe" shape)
- Mathematically (loss of solution uniqueness)
- Regularization
- Motivation, how to, when to
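A small numpy sketch of the closed-form solution above, and of how a ridge penalty $\lambda I$ keeps $\tilde{X}^\top \tilde{X} + \lambda I$ invertible; the toy data and $\lambda$ value are made up, and for brevity the offset is regularized along with the other parameters:

import numpy as np

# Toy regression data: n = 50 points, 2 raw features, plus a constant feature.
rng = np.random.default_rng(1)
X = rng.normal(size=(50, 2))
X_tilde = np.hstack([X, np.ones((50, 1))])      # this is X~ (constant feature appended)
Y = X @ np.array([[2.0], [-1.0]]) + 3.0 + 0.05 * rng.normal(size=(50, 1))

# theta* = (X~^T X~)^{-1} X~^T Y, valid when X~^T X~ is invertible
theta_ols = np.linalg.solve(X_tilde.T @ X_tilde, X_tilde.T @ Y)

# Ridge: the lam * I term makes the matrix invertible even when X~^T X~ is not
# (e.g. duplicated features, or fewer data points than features).
lam = 0.1
theta_ridge = np.linalg.solve(X_tilde.T @ X_tilde + lam * np.eye(3), X_tilde.T @ Y)

print(theta_ols.ravel())      # roughly [2, -1, 3]
print(theta_ridge.ravel())    # shrunk slightly toward zero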
Week 3 - Gradient Descent
- The gradient vector (both analytically and conceptually)
- The gradient-descent algorithm and the key update formula (see the sketch after this list)
- (Convex + small-enough step-size + gradient descent + global min exists + run long enough) guarantee convergence to a global min
- What happens when any of these conditions is violated
- How the stochastic variant differs (setup, run-time behavior, and convergence conclusion)
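A minimal sketch of the update $\theta \leftarrow \theta - \eta \nabla_\theta J(\theta)$ on a convex least-squares objective, plus the stochastic variant that estimates the gradient from one randomly drawn point per step; the data, step size, and iteration counts are arbitrary illustrative choices:

import numpy as np

rng = np.random.default_rng(2)
X = np.hstack([rng.normal(size=(100, 2)), np.ones((100, 1))])   # features with constant appended
Y = X @ np.array([[1.5], [-0.5], [2.0]]) + 0.05 * rng.normal(size=(100, 1))

def grad(theta):
    # gradient of J(theta) = (1/n) * ||X theta - Y||^2
    return (2 / len(X)) * X.T @ (X @ theta - Y)

eta = 0.05                       # small fixed step size
theta = np.zeros((3, 1))         # initialization
for t in range(2000):            # "run long enough"
    theta = theta - eta * grad(theta)

# Stochastic variant: same update rule, but each step uses the gradient of the
# loss on a single randomly chosen data point (noisier, cheaper per step).
theta_sgd = np.zeros((3, 1))
for t in range(2000):
    i = rng.integers(len(X))
    xi, yi = X[i:i + 1], Y[i:i + 1]
    theta_sgd = theta_sgd - eta * 2 * xi.T @ (xi @ theta_sgd - yi)

print(theta.ravel())       # close to [1.5, -0.5, 2]
print(theta_sgd.ravel())   # similar, but noisier with a fixed step size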
Week 4 - Classification
- (Binary) linear classifier (sign based)
- (Binary) Logistic classifiers (sigmoid, NLL loss; see the sketch after this list)
- Linear separator (the equation form, visual form with normal vector)
- Linear separability (interplay with features)
- How to handle multiple classes
- Softmax generalization (Softmax, cross-entropy)
- Multiple sigmoids
- One-vs-one, one-vs-all
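A short sketch of the pieces named above: a sigmoid prediction with NLL loss for the binary case, and softmax with cross-entropy for K classes; the data point, weights, and labels are made-up numbers:

import numpy as np

def sigmoid(z):
    return 1 / (1 + np.exp(-z))

# Binary logistic classifier on one point x with label y in {0, 1}.
x = np.array([1.0, 2.0])
theta, theta_0 = np.array([0.5, -0.25]), 0.1
g = sigmoid(theta @ x + theta_0)                    # predicted probability of class 1
y = 1
nll = -(y * np.log(g) + (1 - y) * np.log(1 - g))    # negative log-likelihood loss

# Softmax generalization for K = 3 classes: one column of weights per class.
def softmax(z):
    e = np.exp(z - np.max(z))                       # subtract max for numerical stability
    return e / e.sum()

W = np.array([[0.5, -0.2, 0.0],
              [0.1,  0.3, -0.4]])                   # shape (d, K)
b = np.zeros(3)
p = softmax(W.T @ x + b)                            # predicted class distribution
y_onehot = np.array([0.0, 1.0, 0.0])                # one-hot label for the second class
cross_entropy = -np.sum(y_onehot * np.log(p))

print(g, nll)
print(p, cross_entropy)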
Week 5 - Features
- Feature transformations
- Apply a fixed feature transformation
- Hand-design a feature transformation (e.g. to achieve linear separability)
- Interplay between the number of features, the quality of features, and the quality of learning algorithms
- Feature encoding
- One-hot, thermometer, factored, numerical, standardization (see the sketch after this list)
- When and why to use any of those
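A quick sketch contrasting one-hot and thermometer encodings for an ordered categorical feature, plus standardization of a numeric column; the feature values are made up for illustration:

import numpy as np

sizes = ["S", "M", "L"]                       # an ordered categorical feature

def one_hot(value):
    v = np.zeros(len(sizes))
    v[sizes.index(value)] = 1.0               # exactly one coordinate is 1
    return v

def thermometer(value):
    v = np.zeros(len(sizes))
    v[: sizes.index(value) + 1] = 1.0         # fill up to the value; preserves order
    return v

print(one_hot("M"))        # [0. 1. 0.]
print(thermometer("M"))    # [1. 1. 0.]

# Standardization of a numeric feature: zero mean, unit standard deviation.
ages = np.array([18.0, 22.0, 30.0, 42.0])
ages_std = (ages - ages.mean()) / ages.std()
print(ages_std)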
Week 6 - Neural Networks
- Forward-pass (for evaluation; see the sketch after this list)
- Backward-pass (via backpropagation, for optimization)
- Source of expressiveness
- Output layer design
- dimension, activation, loss
- Hand-designing weights
- to match some given function form
- to achieve some goal (e.g. separate a given data set)
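A small forward-pass sketch for a two-layer network (ReLU hidden layer, sigmoid output, a natural pairing with NLL loss for binary classification); the weights below are arbitrary numbers, not hand-designed for any particular target function:

import numpy as np

def relu(z):
    return np.maximum(z, 0)

def sigmoid(z):
    return 1 / (1 + np.exp(-z))

x = np.array([[1.0], [-2.0]])                 # input, shape (2, 1)
W1 = np.array([[0.5, -1.0],
               [0.25, 0.75],
               [-0.5, 0.1]])                  # layer-1 weights, shape (3, 2)
b1 = np.zeros((3, 1))
W2 = np.array([[1.0, -1.0, 0.5]])             # layer-2 weights, shape (1, 3)
b2 = np.array([[0.0]])

z1 = W1 @ x + b1        # layer-1 pre-activation
a1 = relu(z1)           # hidden-layer activations
z2 = W2 @ a1 + b2       # output pre-activation
a2 = sigmoid(z2)        # predicted probability (output activation)
print(a2)               # backprop would push gradients back through these same steps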
import random

# Random past-exam question picker: choose a term and a question number,
# then print the link to that term's midterm PDF.
terms = ["spring2024", "fall2023", "spring2023",
         "fall2022", "spring2022", "fall2021",
         "fall2019", "spring2019", "fall2018"]
qunums = range(1, 10)   # question numbers 1 through 9
base_URL = "https://introml.mit.edu/_static/fall24/midterm/review/midterm-"

term = random.choice(terms)
num = random.choice(qunums)
print("term:", term)
print("question number:", num)
print(f"Link: {base_URL + term}.pdf")
- Past exams
General problem-solving tips




More detailed CliffsNotes
General exam tips
- Arrive 5 minutes early to get settled in.
- Bring a pencil (and eraser), a watch, and some water.
- Look over the whole exam and strategize the order in which you do the problems.
Good luck!
