## Course Description
This class covers theoretical foundations, algorithms, methodologies, and applications for machine learning. Topics may include supervised methods for regression and classification (linear models, trees, neural networks, ensemble methods, instance-based methods); generative and discriminative probabilistic models; deep learning models including CNNs, Transformers, and graph neural networks for vision and language tasks; and Markovian models for reinforcement learning and robotics.
Machine learning is at the core of modern artificial intelligence, transforming how we approach problems in vision, language, robotics, recommendation systems, and countless other areas. This course introduces the theoretical foundations, algorithms, and applications of machine learning, combining mathematical rigor with practical experience.
Throughout the semester, we will explore the full machine learning pipeline, from problem formulation and working with data to designing and optimizing models. We will begin with a discussion of what machine learning is, how problems are framed and categorized, and the taxonomy of common learning paradigms. We will review the mathematical background necessary for the course, including probability and optimization concepts.
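As a flavor of the optimization background reviewed early in the course, here is a minimal, purely illustrative sketch (not course-provided code) of gradient descent on a one-dimensional quadratic; the function name and learning rate are arbitrary choices for the example.

```python
# Illustrative sketch: gradient descent on f(x) = (x - 3)^2,
# whose gradient is f'(x) = 2 * (x - 3). The minimizer is x = 3.

def gradient_descent(grad, x0, lr=0.1, steps=100):
    """Repeatedly step against the gradient to approach a minimizer."""
    x = x0
    for _ in range(steps):
        x -= lr * grad(x)
    return x

x_min = gradient_descent(lambda x: 2 * (x - 3), x0=0.0)
print(round(x_min, 4))  # prints 3.0
```

Each step shrinks the distance to the minimizer by a constant factor (here 0.8), which is why a modest number of iterations suffices on this toy problem.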
We will then study unsupervised learning methods, including clustering with k-means and Expectation–Maximization, and dimensionality reduction with Principal Components Analysis (PCA). We will discuss regression in detail, starting with objective functions and then moving to linear regression and its Maximum Likelihood Estimation (MLE) interpretation, and then exploring how regression connects to classification.
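The alternating structure of k-means, assigning points to the nearest center and then recomputing each center as a cluster mean, can be sketched in a few lines. This is a hypothetical one-dimensional toy (the function name `kmeans_1d` and the data are invented for illustration), not the course's implementation.

```python
# Illustrative 1-D k-means with k = 2: alternate between the
# assignment step and the mean-update step.

def kmeans_1d(points, centers, iters=10):
    """Assign each point to its nearest center, then move each
    center to the mean of its assigned points."""
    for _ in range(iters):
        clusters = {c: [] for c in range(len(centers))}
        for p in points:
            nearest = min(range(len(centers)), key=lambda c: abs(p - centers[c]))
            clusters[nearest].append(p)
        # Empty clusters keep their previous center.
        centers = [sum(cl) / len(cl) if cl else centers[c]
                   for c, cl in clusters.items()]
    return centers

data = [1.0, 1.2, 0.8, 9.0, 9.5, 8.7]
means = sorted(kmeans_1d(data, centers=[0.0, 10.0]))
print(means)  # two cluster means, near 1 and 9
```

Expectation-Maximization generalizes this pattern: soft assignments replace the hard nearest-center rule, and the update step maximizes a likelihood rather than minimizing squared distance.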
The deep learning portion of the course will begin with neural network fundamentals, including building non-linear models, choosing architectures and activation functions, and defining loss functions. We will cover PyTorch implementations, backpropagation, batch normalization, initialization strategies, and regularization. We will then survey other deep architectures such as Convolutional Neural Networks (CNNs), Recurrent Neural Networks (RNNs), and Transformers, along with generative models such as Large Language Models (LLMs), autoencoders, and Generative Adversarial Networks (GANs).
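The forward/backward pattern behind backpropagation can be seen in miniature on a single linear unit. The sketch below is a hypothetical hand-derived example in plain Python (no PyTorch), assuming one training point and squared loss; frameworks automate exactly these chain-rule steps.

```python
# Illustrative sketch: train y_hat = w * x + b on one example
# (x, y) by gradient descent on the squared loss (y_hat - y)^2.

w, b = 0.0, 0.0
x, y = 2.0, 7.0   # a single assumed training example
lr = 0.05

for _ in range(200):
    y_hat = w * x + b            # forward pass
    grad_out = 2 * (y_hat - y)   # dLoss/dy_hat
    w -= lr * grad_out * x       # chain rule: dLoss/dw = dLoss/dy_hat * x
    b -= lr * grad_out           # chain rule: dLoss/db = dLoss/dy_hat

print(round(w * x + b, 4))  # the fitted prediction approaches y = 7
```

In a deep network the same chain rule is applied layer by layer, which is what `loss.backward()` does for you in PyTorch.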
Finally, the course will include guest lectures from leading researchers, discussions of emerging topics in the field, and opportunities to connect theoretical concepts to cutting-edge research and applications.
<!-- TODO: Add previous course offerings. (We can't do this the way Data100 has been. I know how to link all the Spring versions, but I'd have to hunt down Fall versions.) -->
### Goals
- Provide a rigorous foundation in the mathematics, algorithms, and concepts of machine learning.
- Prepare students for advanced coursework and research in artificial intelligence, deep learning, computer vision, and natural language processing.
- Enable students to implement machine learning algorithms and apply them to real-world problems.
### Prerequisites
This course assumes strong preparation in mathematics and programming. The required prerequisites are:
- **Multivariable Calculus**: MATH 53
- **Linear Algebra**: MATH 54 or equivalent
- **Probability and Discrete Mathematics**: COMPSCI 70 or equivalent
You should be comfortable with vector calculus (including gradients and the multivariate chain rule), matrix operations, probability theory (including conditional probability and Bayes’ rule), and writing/debugging complex programs in Python. If you lack preparation in these areas, you are likely to struggle.
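As a quick self-check on the probability prerequisite, here is a standard worked use of Bayes' rule; the scenario and the numbers below are invented purely for illustration.

```python
# Illustrative Bayes' rule refresher:
# P(A | B) = P(B | A) * P(A) / P(B),
# with P(B) expanded by the law of total probability.

p_disease = 0.01                # prior P(A): 1% base rate (assumed)
p_pos_given_disease = 0.95      # P(B | A): test sensitivity (assumed)
p_pos_given_healthy = 0.05      # P(B | not A): false-positive rate (assumed)

p_pos = (p_pos_given_disease * p_disease
         + p_pos_given_healthy * (1 - p_disease))
p_disease_given_pos = p_pos_given_disease * p_disease / p_pos

print(round(p_disease_given_pos, 4))  # about 0.161: most positives are false
```

If this computation (and why the posterior is so much smaller than the sensitivity) feels routine, your probability background is in good shape for the course.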