- Time and place: Wednesdays, 8-10 am, online over Zoom (previously: CAB G 51)
- Starting March 18, lectures will also be recorded and posted online
- Instructor: Fanny Yang
- Teaching assistants:
    - Alexandru Tifrea (tifreaa at inf.ethz.ch), Amir Joudaki (ajoudaki at inf.ethz.ch)
- Office hours: Monday, 4-5 pm, online over Zoom upon request via email (previously: CAB G 69.3)

- Sign up on the waitlist until **March 1st** (fill out the form)
- De-register until **March 13th**

- Piazza: announcements, discussion of homework and classes
- Gradescope: homework submission; enroll with entry code 975DPG
- EduApp: in-class check-ins and feedback

This course is designed to prepare Master's students for successful research in ML and to prepare PhD students to find new research ideas related to ML theory. Content-wise, the technical part will focus on generalization bounds via uniform convergence and on non-parametric regression.
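As a rough sketch of the "generalization bounds via uniform convergence" theme, a canonical result (stated here under the illustrative assumption of a loss bounded in $[0,1]$; the exact constants and complexity measure depend on the setting) reads:

```latex
% With probability at least 1 - \delta over an i.i.d. sample of size n,
% uniformly over all predictors f in the function class \mathcal{F}:
\sup_{f \in \mathcal{F}}
  \left| \underbrace{\frac{1}{n} \sum_{i=1}^{n} \ell\bigl(f(x_i), y_i\bigr)}_{\text{empirical risk}}
       - \underbrace{\mathbb{E}\, \ell\bigl(f(X), Y\bigr)}_{\text{population risk}} \right|
\;\le\; 2\,\mathfrak{R}_n(\ell \circ \mathcal{F})
      + \sqrt{\frac{\log(2/\delta)}{2n}},
```

where $\mathfrak{R}_n(\ell \circ \mathcal{F})$ denotes the Rademacher complexity of the loss class. Bounds of this shape, and the tools for controlling $\mathfrak{R}_n$, are developed in the core reference (Wainwright) listed below.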

By the end of the course, you will be able to

- easily read and write theorems that provide generalization guarantees for machine learning algorithms
- find high-impact questions and theorems to prove and work on that you are passionate about

**How to get there**

- understand and apply known and new generalization concepts for ML guarantees
- critically read papers, assessing strengths and weaknesses with clear evidence; conjecture and prove new, impactful results
- communicate your own ideas, results, and knowledge effectively in a paper and a presentation
- learn to collaborate

- Grading: 10% homework, 50% oral midterm, 40% project
- Homework:
    - some graded homework problems
    - the rest are self-graded (mandatory hand-in)

- Project report and presentation: see project website
- Attendance is mandatory during the last four weeks of classes, when presentations take place

Homeworks are designed to

- practice some technical ("just algebra") work that needs to be done individually
- teach you how to read further material on the topic effectively (**homework content will be part of the midterm exam!**)

No late homework will be accepted.

Each homework write-up must be neatly typeset as a PDF document using TeX, LaTeX, or a similar system (for more details see below). This is for you to practice typesetting efficiently. Ensure that the following appear on the first page of the write-up:

- your name,
- your Student ID, and
- the names and IDs of any students with whom you discussed the assignment.

Submit your write-up, **one page per question**, as a single PDF file to Gradescope by 11:59 PM on the specified due date. Follow the instructions and mark the pages that belong to the corresponding questions. See the homework sheet for more details. Some questions will be graded by the TAs; all questions will be self-graded by you.

Discussions take place on Piazza.

We expect you, as graduate students, to take this class because you want to learn the material and how to do research. All assessments are designed to maximize learning. Cheating only harms you, so it is in your own interest to adhere to the following policy.

All homework is submitted individually, and must be in your own words.

For homeworks 1-2, you may discuss only at a high level with up to three classmates; please list their IDs on the first page of your homework. Everyone must still submit an individual write-up in their own words; indeed, your discussions with classmates should stay at a high enough level that anything other than your own words would not be possible.

We prefer that you do not dig around for homework solutions; if you do rely on external resources, cite them and still write your solutions in your own words.

Any integrity violations that are found will be reported to the department's evaluation board.

- Subject to frequent changes; check back often!
- Assignments are released and due on the **Friday** of the week, unless specified otherwise
- The slides are not shown as-is during lecture, but they contain a superset of the content of each lecture

The book links below point to online resources that are freely accessible from the ETH Zurich network.

**Learning Theory**

Martin Wainwright: High-Dimensional Statistics (core reference for the course)

Percy Liang: Statistical Learning Theory, Stanford Lecture notes

**Some more background reading for your general wisdom, knowledge and entertainment**

Keener: Theoretical Statistics (e.g., asymptotic optimality of the MLE, UMVU testing)

Steinwart and Christmann: Support Vector Machines (a more mathematical treatment of RKHSs)

van der Vaart and Wellner: Weak Convergence and Empirical Processes