Probability Theory Autumn 2023

Igor Kortchemski
Robert A. Crowell
Ritvik Radhakrishnan
Course Catalogue


Measure theory formalism and probability theory, Dynkin's lemma and independence, convergence of series of independent random variables, law of large numbers, conditional expectation, martingale convergence theorems, uniform integrability, optional stopping theorem for martingales, the Bienaymé-Galton-Watson process and its R-number, convergence in law and the central limit theorem.


Lectures will take place in-person at ML H 44 on Thursdays and Fridays from 08:15 to 10:00.

Video recordings

Lectures will be recorded and uploaded to the ETH Video Portal (the use of full-screen mode with 1080p quality is recommended for optimal readability of the blackboard).

Lecture Notes

The notes are copyright protected and may not be circulated without permission.

Recaps
  • Recap 1: sets
  • Recap 2: probability spaces
  • Recap 3: countable sets
  • Recap 4: topology (ℝ^n, induced topology and induced σ-fields)
  • Recap 5: Dictionary between probabilistic and set vocabularies
  • Recap 6: random variables
  • Recap 7: convergence theorems
  • Recap 8: classical laws
  • Recap 9: convergence of random variables
  • Recap 10: conditional expectations
  • Recap 11: approximation toolbox
  • Recap 12: (sub/super)martingales (updated 7/12 with addition of "Terminology" part)
  • Recap 13: convergence in distribution, characteristic functions
  • Handwritten lecture notes, which correspond essentially to what is written on the blackboard during the lectures (large files):

    Notes from the lectures
  • Motivation and presentation slides
  • Chapter 0: discrete probability (slides)
  • Chapter 1: σ-fields, measures (updated 19/10: equivalence turned into an implication in Proposition p14)
  • Chapter 2: random variables (updated 26/10: remarks added top of pages 23 and 24)
  • Chapter 3: sequences and series of independent random variables (updated 9/11: added a.s. uniqueness of the limit for convergence in probability pages 9/10 as seen in class on November 9th.)
  • Chapter 4 part 1: conditional expectation (updated 10/11; proof of conditional Jensen's inequality corrected)
  • Chapter 4 part 2: (sub/super)martingales and their a.s. convergence (updated 17/11)
  • Chapter 5: uniformly integrable martingales, stopping times (updated 24/11)
  • Chapter 6: L^p martingales, p>1. (updated 1/12)
  • Chapter 7: convergence in distribution. (updated 14/12: proof of step 1 page 6 updated and detailed). Section 6 on Gaussian vectors is not part of the exam.
  • Chapter 8: a glimpse on statistical theory (updated 22/12). Not part of the exam.
  • The Brownian Universe (slides of the last lecture). Not part of the exam.
  • If you spot any typos or mistakes, please send an email to

    Here are typed notes if you are more comfortable with them.

    Typed notes

    To access the typed lecture notes you must have an active enrollment for the lecture via myStudies; please log in using your nethz credentials:

  • Lecture Notes (Part I, Ch. 1-6)
  • Lecture Notes (Part II, Ch. 7-9)
  • We shall cover essentially the same material as in these typed notes (the order will sometimes change a bit).

    Numerical simulations in Probability using Python

    It is interesting and useful to know how to do numerical probabilistic simulations. Here is some material, using the Python programming language, for students who are interested. Note that this is completely optional: it is made available for self-study and will not be discussed in class; in particular, none of it will be part of the exam.

    The following documents are Jupyter notebook files (save them on your computer and open them from Jupyter Notebook). For installation, see the Jupyter webpage or install Anaconda.

  • Python tutorial for probabilists
  • Python for probabilists: practical work
  • Python for probabilists: practical work (with solutions)
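For students who want a taste of such simulations before opening the notebooks, here is a minimal sketch of the strong law of large numbers using only Python's standard library (the function name and parameters are illustrative, not part of the course material):

```python
import random

def empirical_mean(n, p=0.5, seed=0):
    """Average of n Bernoulli(p) samples; by the strong law of large
    numbers this converges almost surely to p as n grows."""
    rng = random.Random(seed)  # fixed seed for reproducibility
    return sum(rng.random() < p for _ in range(n)) / n

# The empirical mean approaches p = 0.5 as the sample size grows.
for n in (10, 1_000, 100_000):
    print(n, empirical_mean(n))
```

The course notebooks above cover this in far more detail (including plots of the running average); this is only a starting point.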
  • Diary of the lectures

    The references correspond to the handwritten lecture notes.

    Detailed lecture overview
    no. date content references
    1 21.09.23 Motivation and presentation of the course.
    Why are you interested in taking this course?
    Why do we need measure theory in probability theory?
    Discrete probability warm-up. σ-fields, generated sigma fields.       
    • Motivation and presentation slides
    • Chapter 0 slides
    • Chapter 1, pages 1-4 
    2 22.09.23 Examples of σ-fields (cylinder σ-field, Borel σ-field, product σ-field), measures, Dynkin systems. • Chapter 1, pages 4-8
    3 28.09.23 The Dynkin lemma, independence of a finite number of events • Chapter 1, pages 9-13
    4 29.09.23 Independence of σ-fields, coalition principle, independence of any family of σ-fields and events, Borel-Cantelli lemmas, measurable functions, image measure, law of a random variable • Chapter 1, pages 14-16
    • Chapter 2, pages 1-2
    5 5.10.23 Cumulative distribution functions (cdf), σ-fields generated by functions, product σ-fields; • Chapter 2, pages 2-6
    6 6.10.23 Product measure, independence of random variables • Chapter 2, pages 6-9
    7 12.10.23 Tail σ-field, Kolmogorov 0-1 law, real-valued random variables, Doob-Dynkin Lemma
    • Chapter 2, pages 10-13
    8 13.10.23 Integration with respect to a measure, integration theorems: monotone convergence, Fatou, Fubini-Tonelli, dominated convergence, Fubini-Lebesgue (no proofs), classical discrete laws. • Chapter 2, pages 13-21
    9 19.10.23 Random variables with densities, classical continuous laws, transfer theorem, dummy function method • Chapter 2, pages 21-24
    10 20.10.23 Densities and independence, integration and independence, variance, the use of Borel-Cantelli for almost sure convergence • Chapter 2, pages 25-28
    • Chapter 3, pages 1-2
    11 26.10.23 L^4 version of the Strong Law of Large numbers, Kolmogorov's maximal inequality • Chapter 3, pages 2-5
    12 27.10.23 Kolmogorov two series theorem, Kolmogorov three series theorem, strong law of large numbers via Kronecker's lemma. • Chapter 3, pages 5-9
    13 2.11.23 Different notions of convergence: convergence in probability, convergence in L^p, Bienaymé-Tchebychev inequality, subsequence Lemma for convergence in probability, flying saucepan example, spiky cat example. • Chapter 3, pages 9-12
    14 3.11.23 Uniform integrability, epsilon-delta condition, "super dominated convergence" theorem. Existence of a sequence of i.i.d. random variables not covered during the lecture. Conditional expectation (discrete setting). • Chapter 3, pages 12-14
    • Chapter 4 part 1, pages 1-3
    15 9.11.23 Uniqueness of limits for convergence in probability, definition of conditional expectation (characteristic property), first properties. • Chapter 3, pages 9-10
    • Chapter 4 part 1, pages 3-6
    16 10.11.23 Proof of existence and uniqueness of conditional expectations. Conditional expectation of [0,∞]-valued random variables. Conditional convergence theorems (conditional monotone convergence, conditional Fatou, conditional Dominated Convergence, conditional Jensen).   • Chapter 4 part 1, pages 6-10
    17 16.11.23 Further properties of conditional expectation (tower property, addition of independent information), conditional densities not covered in class. Definitions of (sub/super)martingales, first properties. • Chapter 4 part 1, pages 10-11
    • Chapter 4 part 2, pages 1-3
    18 17.11.23 Discrete stochastic calculus, the (sub/super)martingale a.s. convergence theorem through Doob's upcrossing Lemma, definition of the Bienaymé-Galton-Watson process • Chapter 4 part 2, pages 4-7
    19 23.11.23 Asymptotic behavior of the Bienaymé-Galton-Watson process using martingales. L^1 version of the strong law of large numbers. Characterization of martingales that converge in L^1.
    • Chapter 4 part 2, pages 7-9
    • Chapter 5, pages 1-2
    20 24.11.23 A.s. and L^1 convergence of closed martingales, stopping times, stopped σ-field, optional stopping theorem • Chapter 5, pages 3-6
    21 30.11.23 Application of optional stopping to the study of random walk (with the quadratic and exponential martingales). Statement of Doob's maximal inequalities. • Chapter 5, pages 6-8
    • Chapter 6, page 1
    22 1.12.23 Proof of Doob's maximal inequalities, Doob L^p inequalities, L^p bounded martingales. Definition of convergence in distribution and first remarks. • Chapter 6, pages 1-4
    • Chapter 7, pages 1-2
    23 7.12.23 Examples of random variables converging in distribution, uniqueness of the limit in law, continuous mapping, convergence in distribution is implied by a.s., in probability, and L^p convergence. Portmanteau theorem (difficult part, (5) implies (6), not covered in class) • Chapter 7, pages 2-5
    24 8.12.23 Probabilistic reformulation of the Portmanteau theorem, extended continuous mapping, characterization of convergence in distribution using cdf's, Slutsky's theorem, restricting the class of test functions to continuous functions with compact support. • Chapter 7, pages 5-8
    25 14.12.23 Convergence in distribution for integer-valued random variables, characteristic functions and first properties, characteristic function of a Gaussian r.v., characteristic functions characterize laws. • Chapter 7, pages 8-11
    26 15.12.23 Consequences of characterization of laws using characteristic functions. Lévy's theorem, central limit theorem. • Chapter 7, pages 11-14
    27 21.12.23 Gaussian vectors, a glimpse on statistical theory (estimators). • Chapter 7, pages 15-18
    • Chapter 8, pages 1-2
    28 22.12.23 A glimpse on statistical theory (confidence intervals), overview of simulations of the strong law of large numbers and the central limit theorem in Python, the Brownian Universe. • Chapter 8, pages 2-3
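The last lecture mentions Python simulations of the central limit theorem. As a rough, entirely optional sketch (standard library only; the helper name and parameters are illustrative, not from the course material), one can standardize sums of uniform random variables and check that they behave like a standard normal:

```python
import math
import random

def standardized_sum(n, rng):
    """(S_n - n*mu) / (sigma * sqrt(n)) for S_n a sum of n Uniform(0,1)
    variables (mu = 1/2, sigma^2 = 1/12); by the central limit theorem
    this is approximately N(0, 1) for large n."""
    s = sum(rng.random() for _ in range(n))
    return (s - n / 2) / math.sqrt(n / 12)

rng = random.Random(42)  # fixed seed for reproducibility
samples = [standardized_sum(200, rng) for _ in range(10_000)]

# Sanity checks against N(0, 1): P(Z <= 0) = 1/2, P(|Z| <= 1.96) ≈ 0.95.
print(sum(z <= 0 for z in samples) / len(samples))
print(sum(abs(z) <= 1.96 for z in samples) / len(samples))
```

A histogram of `samples` (e.g. with matplotlib) would show the familiar bell curve; the course's Jupyter notebooks are the place to explore this properly.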


    While handing in your solutions is not compulsory, experience shows that being able to solve the exercises independently goes a long way towards good exam performance. We strongly encourage you to hand in written solutions. This can be done electronically. Please follow the instructions below.

    1. Make sure you're registered for one of the problem classes in myStudies.
    2. Scan your solution into a single PDF file.
    3. Name the created file as {email address}_{exercise sheet number}.pdf (e.g. andrey.kolmogorov@probability.com_1.pdf).
    4. Upload your solution using the link in the table below (please submit to the problem class you registered for) before 17:00 on the indicated Monday.

    Late submissions will not be considered. In case you submit your solution early and then decide to change something and submit again, we will only consider the most recent submission.

    Note that you can submit at most two exercises (Exercise 1 and/or an exercise from the Training exercises).

    Exercise sheets
    exercise sheet due by solutions
    Exercise sheet 1 (updated 21/09 20:30) September 25 Solutions
    Exercise sheet 2 October 2 Solutions (updated 03/10 23:15)
    Exercise sheet 3 October 9 Solutions
    Exercise sheet 4 (updated 13/10 15:26) October 16 Solutions (added 17/10 14:00)
    Exercise sheet 5 October 23 Solutions (updated 24/10 8:30)
    Exercise sheet 6 October 30 Solutions (updated 31/10 10:00)
    Exercise sheet 7 November 6 Solutions (Updated 9/11: two other solutions for Exercise 1 (5) added)
    Exercise sheet 8 November 13 Solutions (added 13/11 17:30)
    Exercise sheet 9 November 20 Solutions (added 20/11 17:30)
    Exercise sheet 10 (updated 27/11 16:30: in Exercise 3 (1), the assumption "|Xn| ≤ Y" should be present) November 27 Solutions (added 28/11 13:30)
    Exercise sheet 11 December 4 Solutions (added 4/12 17:30)
    Exercise sheet 12 December 11 Solutions (added 11/12 17:00)
    Exercise sheet 13 December 18 Solutions (added 20/12 13:00)

    Exercise classes

    Exercise classes start in the second week of the semester.

    time room assistant submit solution
    Tu 14-15 ML F 34 Nicolas Hotton submit solution
    Tu 14-15 ML H 41.1 Robert Crowell/Ritvik Radhakrishnan submit solution
    Tu 15-16 ML H 41.1 Zifan Lyu submit solution

    Office hours & Forum


    You will find the forum here. It is intended to be collaborative, i.e. for students to help students. The course assistants and the lecturer may moderate and answer more advanced questions. Note that the forum will be closed after Fri. 19.01.2024, before the start of the exam session.

    Please sign up using your email address. Some FAQs for the forum are here.

    Office hours

    Office hours are offered by group 3 on Mondays and Thursdays from 12:00 to 13:00 in HG G 32.6 starting the fourth week of the semester.


    Please note that the exam for this course will be written. Further information (e.g. concerning permitted aids and length) can be found in the course catalogue.

    Preparation for the Exam (made available one month before the day of the exam).

    Exam - Suggested solution (uploaded 21/03)

    Exam review after the announcement of the grades: for information about the exam review, please consult this website.

    Sample Exam

    Preparation for Sample Exam.

    Sample Exam
    The sample exam should serve as broad guidance for the general format and style of the final exam.
    A suggested solution together with a rough grading scheme will be uploaded on November 16th. You can then compare your solutions to it. The suggested grading scheme provides a rough idea of the amount of detail that is expected from you in the exam and should help you prepare accordingly.
    Naturally, the final exam will also cover topics of the course that have not yet been introduced. Please note as well that grading schemes are tailored to a given exam, so the grading of the final exam may differ from that of the sample exam.
    The optional "more involved exercises" on the exercise sheets are not a requirement for solving the exam. Their goal is to give interested students the opportunity to tackle more challenging problems.

    Suggested solution of the Sample Exam with rough grading scheme

    Additional literature