From the lectures

Brief reports from the lectures with links to lecture notes and podcasts:

Wednesday, August 19th: Started lecturing from sections 1.1-1.2, but rearranged the material a little bit, postponing the material on monotone classes until next time. More precisely, I have covered all of section 1.2, except the second half of Proposition 1.10 (which I will do next time), and most of 1.1 except the material on fields and monotone classes. Podcast (sorry about the sound). Notes.

Thursday, August 20th: Completed sections 1.1-1.2. The emphasis was on the relationship between \(\sigma\)-algebras, algebras and monotone classes, and the highlight (or lowlight) was the long and complicated proof of The Monotone Class Theorem (the proof itself isn't that important, but the proof technique will also be useful in other contexts). I shall leave section 1.3 (which mainly consists of examples) to self-study and continue with sections 1.4 and 1.5 next time. Podcast. Notes.

Wednesday, August 26th: I started by showing a typical argument using the Monotone Class Theorem, and then went through sections 1.4 and 1.5, following the book rather closely. I then started Chapter 2 where I introduced random variables and fleshed out some of the arguments (e.g. by solving most of exercise 2.1) - the book is very sketchy here. Next time I shall start with Definition 2.2.

The projector didn't work today, so I had to give the lecture on the blackboard, and there is no podcast and there are no notes. However, this podcast from last year in combination with the first half of this one covers the same material. For notes, see the second half of this and the first half of this (all in Norwegian, unfortunately).

Thursday, August 27th: Before the break I defined distribution functions and proved the properties in Propositions 2.5 and 2.6. After the break I solved problems 1.3, 1.5, 1.21, 1.22, 1.23 (you can find solutions of 1.6, 1.8, 1.9, 1.16 in the notes from last year). Notes.

The projector worked faultlessly today, but there was a lot of discussion at the end of the lecture, and unfortunately it seems that I forgot to upload the podcast and that it is hopelessly lost. You may try to watch the corresponding videos from last year instead: Before the break. After the break (from 48:20).

Wednesday, September 2nd: I first proved Proposition 2.8, used it to define distributions, and then proved Proposition 2.10. After the break, I sketched the proof of Theorem 2.14 before I defined independence of random variables. I outlined the proof of Theorem 2.30 and don't think I will return to the details tomorrow (you can find a more detailed proof in the lectures from last year: start at 50:00 here). Notes. Podcast.

Thursday, September 3rd: I first went quickly through Section 2.4 and then continued with Section 2.5 up to and including Remark 2.26. After the break I did the "extra problems" plus Exercise 2.2. Notes. Podcast (I changed microphone after the break and the sound got better then).

Wednesday, September 9th: I finished section 2.5 by proving Proposition 2.28 (and skipping all the interesting examples). I then went to section 3.1  where I defined the expectation for general random variables and started proving Theorem 3.4 (I managed to prove (i)-(iii) before time was up). The plan is to prove the rest of Theorem 3.4 plus Theorem 3.5 tomorrow. I will then go back to section 2.6 and say a few words about variances and moments without making much out of it. Notes. Podcast.

Thursday, September 10th: Before the break, I proved the rest of Theorem 3.4 plus Theorem 3.5. After the break, I did problems 2.4, 2.7, 2.10, an extra problem, 2.30, and 2.21 (in this order). The more theoretical problems 2.16, 2.17, and 2.20 were done last year (see notes and podcast in Norwegian). Notes. Podcast.

Wednesday, September 16th: I first went back to section 2.6 to say a few words about moments. Then I went through sections 3.2 (a little superficially) and 3.3. I then sprinted through section 3.4 where I just barely managed to prove Jensen's inequality. Next time, I'll say a few words about Lyapounov's inequalities (Corollary 3.23) and then go to chapter 4. Notes. Podcast.

Thursday, September 17th: I first proved Lyapounov's inequalities (Cor. 3.23) and then introduced the many ways of converging in Section 4.1, where I also proved Props. 4.4 and 4.5, and presented Example 4.5.1. After the break I did problems 3.3, 3.5, 2.37, and (a quick sketch of) 3.4. (The presentation of 3.4 is probably better on last year's podcast). Next time, I'll begin with \(\limsup\) and \(\liminf\) for sets and then prove the Borel-Cantelli Lemma. I also hope to make some headway with the important Section 4.2. Notes. Podcast (the sound is better after the break).

Wednesday, September 23rd: I started by introducing \(\limsup\) and \(\liminf\) for sets. I then went a little more deeply into the relationship between the product \(\prod_n(1-x_n)\) and the sum \(\sum_n x_n\) than the book does, before proving the Borel-Cantelli Lemma. After the break, I gave a simple but typical probabilistic application of the lemma and then used it to prove Theorem 4.10. I then moved on to Section 4.2, where I just had time to finish the proof of Lemma 4.11. Notes. Podcast.
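The connection between the product \(\prod_n(1-x_n)\) and the sum \(\sum_n x_n\) can also be seen numerically: for \(x_n\in[0,1)\), the partial products tend to 0 exactly when the sum diverges. Here is a small Python sketch (the two sequences are my own illustrative choices, not examples from the book):

```python
def partial_product(xs):
    """Return the product of (1 - x) over the sequence xs."""
    p = 1.0
    for x in xs:
        p *= 1.0 - x
    return p

N = 100_000
# Divergent sum: x_n = 1/n, and prod_{n=2}^{N-1} (1 - 1/n) telescopes to 1/(N-1)
p_div = partial_product(1.0 / n for n in range(2, N))
# Convergent sum: x_n = 1/n^2, and the product converges to a positive limit (1/2)
p_conv = partial_product(1.0 / n**2 for n in range(2, N))

print(p_div)   # about 1e-5, heading to 0
print(p_conv)  # about 0.5
```

Here \(x_n=1/n\) gives a divergent sum and a product tending to 0, while \(x_n=1/n^2\) gives a convergent sum and a product with a strictly positive limit.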

Thursday, September 24th: Before the break, I proved the Monotone Convergence Theorem, Fatou's Lemma, and (Lebesgue's) Dominated Convergence Theorem. The last one I only proved for convergence a.s., and I'll return to the proof for convergence in probability next time. After the break, I did problems 3.13, 3.19, (part of) 2.41, and 2.42. Notes. Podcast.
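To see why some hypothesis (monotonicity or domination) is needed in these convergence theorems, a standard counterexample is \(X_n=n\cdot\mathbf{1}\{U\leq 1/n\}\) with \(U\) uniform on \([0,1]\): then \(X_n\to 0\) almost surely while \(E[X_n]=1\) for every \(n\), so Fatou's inequality is strict and dominated convergence fails. A small Monte Carlo sketch in Python (my own illustration, not a problem from the book):

```python
import random

random.seed(0)
M = 200_000  # number of Monte Carlo samples of U
us = [random.random() for _ in range(M)]

def mean_Xn(n):
    """Sample mean of X_n = n * 1{U <= 1/n}; the exact expectation is 1 for every n."""
    return sum(n for u in us if u <= 1.0 / n) / M

means = {n: mean_Xn(n) for n in (10, 100, 1000)}
print(means)  # each value is close to 1, even though X_n -> 0 almost surely
```

No integrable random variable dominates all the \(X_n\) at once, which is exactly why the Dominated Convergence Theorem does not apply here.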

Wednesday, September 30th: I first finished Section 4.2 by proving the Dominated Convergence Theorem for convergence in probability plus Corollary 4.20. I then skipped Section 4.3 (interesting application but we shall return to a similar example later) and went to Section 5.1 and proved Theorem 5.1 and 5.2. Then I turned to Section 5.4 where I got to Prop. 5.16. Notes. Podcast.

Thursday, October 1st: Before the break I finished Section 5.4 and dealt with 5.5 (mainly definitions). The problem session was rather short as there was a long discussion in the break, but I covered the two problems from last year's exam/trial exam plus problems 4.3 and 4.7 from the book. Notes. Podcast.

Wednesday, October 7th: I finished Chapter 5 by proving the 0-1 law in Theorem 5.22 and then went on to Chapter 6, where I first said a few words about complex-valued random variables, including proving that \(|E(Z)|\leq E(|Z|)\), as this is quite essential in many arguments. I then introduced characteristic functions and proved Theorems 6.3 and 6.4(i) (leaving the induction step in the latter to you). The lecture is a little bit shorter than usual as we had some computer problems in the beginning. Notes. Podcast.
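The inequality \(|E(Z)|\leq E(|Z|)\) is easy to sanity-check by simulation. In this Python sketch (my own example), \(Z=e^{i\Theta}\) with \(\Theta\) uniform on \([0,2\pi)\), so \(E(|Z|)=1\) while \(E(Z)\approx 0\) by symmetry, making the gap between the two sides as large as possible:

```python
import cmath
import random

random.seed(1)
# Z = e^{i*Theta} with Theta uniform on [0, 2*pi): |Z| = 1, so E(|Z|) = 1,
# while the complex expectation E(Z) is 0 by symmetry.
zs = [cmath.exp(1j * random.uniform(0.0, 2.0 * cmath.pi)) for _ in range(100_000)]

EZ = sum(zs) / len(zs)                      # sample version of E(Z), a complex number
E_absZ = sum(abs(z) for z in zs) / len(zs)  # sample version of E(|Z|)

print(abs(EZ), E_absZ)  # |E(Z)| is tiny, E(|Z|) is 1
assert abs(EZ) <= E_absZ
```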

Thursday, October 8th: Before the break, I looked at the characteristic functions of sums of independent random variables and computed the characteristic function of one-dimensional Gaussian distributions (somewhere along the way I managed to lose a minus in the exponent; the characteristic functions should be \(\phi(t)=e^{-\frac{t^2}{2}}\) and \(\phi(t)=e^{-\frac{\sigma^2 t^2}{2}+it\mu}\)). After the break, I did problems 4.11, 4.23, assignment 2019 no. 1, and 4.13, in this order. The taped lecture is a little shorter than usual as we had an informal discussion before the recording started. Notes. Podcast.
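The corrected formula \(\phi(t)=e^{-\frac{t^2}{2}}\) for the standard Gaussian can be checked by a Monte Carlo estimate of \(E(e^{itX})\) (a quick Python sketch of my own, not part of the lecture):

```python
import cmath
import random

random.seed(2)
xs = [random.gauss(0.0, 1.0) for _ in range(200_000)]  # standard normal samples

def phi_hat(t):
    """Empirical characteristic function: the sample mean of e^{itX}."""
    return sum(cmath.exp(1j * t * x) for x in xs) / len(xs)

for t in (0.5, 1.0, 2.0):
    print(t, phi_hat(t), cmath.exp(-t * t / 2.0))  # estimate vs e^{-t^2/2}
```

With the sign error in the exponent, the claimed \(\phi\) would grow like \(e^{t^2/2}\), which the estimates above immediately rule out.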

Wednesday, October 14th: I first went through the note on Fourier inversion and then proved Lévy's Inversion Theorem 6.11 and its Corollary 6.13. I then turned to Section 6.2, where I explained Example 6.13.2 and proved Proposition 6.15. Notes. Podcast.

Thursday, October 15th: Before the break, I proved Theorem 6.17, a very useful and important result. After the break, I did problem 4 from Trial Exam 1, 2019, and problems 5.10 and 5.11 from the book (the last one in a rather hurried fashion). I realized later that the solution I gave for the second part of 5.10 is insufficient, as it only shows that \(\frac{X_1+X_2+\cdots+X_n}{n}\) fails to converge to 0 a.s. and not that it fails to converge in probability. Click here for a correct solution. Notes. Podcast.
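A classic example of averages \(\frac{X_1+X_2+\cdots+X_n}{n}\) that fail to converge even in probability is the standard Cauchy distribution, where the average has the same distribution as a single observation (I am not claiming this is the distribution in problem 5.10, just an illustration of the phenomenon). The Python sketch below checks that \(P(|\bar{X}_n|>1)\) stays near \(1/2\) for all \(n\) instead of shrinking:

```python
import math
import random

random.seed(3)

def cauchy():
    """One standard Cauchy sample, via tan of a uniform angle."""
    return math.tan(math.pi * (random.random() - 0.5))

def frac_large_mean(n, runs=20_000):
    """Fraction of runs in which |(X_1 + ... + X_n)/n| > 1."""
    hits = 0
    for _ in range(runs):
        if abs(sum(cauchy() for _ in range(n)) / n) > 1.0:
            hits += 1
    return hits / runs

# If the averages converged to 0 in probability, these fractions would shrink
# with n; for Cauchy variables they stay near P(|X_1| > 1) = 1/2.
fracs = {n: frac_large_mean(n) for n in (1, 10, 100)}
print(fracs)
```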

Wednesday, October 21st: I first proved Helly's Theorem (6.19) and its corollary 6.22. I then gave a short survey of subsection 6.2.1 before I finished by proving Proposition 6.29 and Lemma 6.30. Next time we shall prove Lévy's Continuity Theorem 6.32. Notes. Podcast.

Thursday, October 22nd: Proved Lévy's Continuity Theorem 6.32 and the simple but useful Lemma 6.34. After the break, I did problems 6.2, 6.3, 6.5, and 6.6. Notes. Podcast.

Wednesday, October 28th: I spent almost all the time on proving the two versions of the Central Limit Theorem, 6.37 and 6.38. In the book, there is a misprint in the statement of Theorem 6.37: The displayed formula should be \(\frac{S_n-n\mu}{\sqrt{n\sigma^2}}\) and not \(\frac{S_n-\mu}{\sqrt{n\sigma^2}}\). Towards the end of the lecture, I covered the material on stochastic processes from Section 7.1. Next time, I shall cover Section 7.4 on stopping times, and if you find them hard to grasp, there is a short note here that may make them seem more natural (but try to read 7.4 first!). Notes. Podcast.

Note: In the lecture, there is an inaccuracy in the proof of Lyapounov's version of the Central Limit Theorem. In Remark (iii) before the proof begins, I should have been a little bit more careful and observed that since the argument applies to any \(j\) less than \(n\), we have actually proved that \(\sup_{j\leq n}\left(\frac{\sigma_j^2}{s_n^2}\right)^{3/2}\leq\frac{\sum_{j=1}^n\gamma_j}{s_n^3}\), and hence \(\sup_{j\leq n}\frac{\sigma_j^2}{s_n^2}\to 0 \mbox{ as }n\to\infty\). This is what is actually needed in the last two lines of the proof. The argument in the book is correct at this point (although a little hard to read).
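As a sanity check on the corrected formula in Theorem 6.37, one can simulate standardized sums \(\frac{S_n-n\mu}{\sqrt{n\sigma^2}}\) and compare with the standard normal. A Python sketch (the choice of Bernoulli(0.3) summands is my own example):

```python
import random

random.seed(4)
n, runs = 500, 10_000
p = 0.3                    # Bernoulli(p) summands: mu = p, sigma^2 = p*(1 - p)
mu, sigma2 = p, p * (1 - p)

zs = []
for _ in range(runs):
    s = sum(1 for _ in range(n) if random.random() < p)  # S_n
    zs.append((s - n * mu) / (n * sigma2) ** 0.5)        # (S_n - n*mu)/sqrt(n*sigma^2)

mean_z = sum(zs) / runs
var_z = sum(z * z for z in zs) / runs - mean_z ** 2
frac_le_1 = sum(1 for z in zs if z <= 1.0) / runs

print(mean_z, var_z, frac_le_1)  # roughly 0, 1, and Phi(1) = 0.8413
```

Note that with the misprinted centering \(S_n-\mu\) the simulated values would drift off to infinity instead of settling around a standard normal.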

Thursday, October 29th: Before the break I covered Section 7.4 on stopping times. After the break I did problems 6.9, 6.21, and 6.22. Next time, we'll do Chapter 8. Notes. Podcast.

Wednesday, November 4th: The physical lecture was cancelled, but you will find a podcast below covering all of chapter 8 except Jensen's Inequality for conditional expectations (which I shall do next time). As I had some problems with the equipment (long time since I have recorded with an iPad!), the podcast is split in two parts and the notes in three parts. Notes 1 and 2 correspond to the first part of the podcast. Notes 1, Notes 2, Notes 3, Podcast 1, Podcast 2.

Thursday, November 5th: Before the break, I did the proof of Jensen's inequality for conditional expectations in gruesome detail and then just had time to define martingales, submartingales, and supermartingales (Section 9.1) and give some motivation. After the break, I did problems 6.29, 7.16, and Problem 1 from Trial Exam 1, 2019. Notes. Podcast.

Wednesday, November 11th: Continued lecturing from Chapter 9. Covered Section 9.1 and proved Theorems 9.8 and 9.9 from Section 9.2. I did things a little differently from the book by introducing the increments \(\Delta X_n=X_{n+1}-X_n\), which I think makes some arguments more transparent. Tomorrow I shall finish Section 9.2 and hence this year's syllabus. This may leave a little less time for problems than usual. Notes. Podcast.

Thursday, November 12th: Finished the syllabus by completing Section 9.2. After the break, I did all this week's problems except 8.7 (which I did most of last year). Next week I shall review parts of the syllabus (plus do problems on Thursday). Notes. Podcast.

Wednesday, November 18th: In the reviews, I try to connect themes in the syllabus that belong together, although they may have been taught at different times. In this first review lecture, I concentrated on:

(i) Families of sets: \(\sigma\)-algebras, algebras, monotone classes, conditional expectations, filtrations, martingales, and \(\sigma \)-algebras as information.

(ii) Probability measures including independence.

(iii) Random variables, including expectations, independence, distribution functions, and distributions.

Notes. Podcast.

Thursday, November 19th: Continued the review by looking at:

(i) Different ways a sequence of random variables can converge: almost surely, in probability, in expectation and in \(L^p\), and in distribution.

(ii) Analytic limit theorems: Monotone convergence, Fatou's lemma, Dominated convergence.

(iii) Probabilistic limit theorems: Laws of large numbers, Central limit theorems. Lévy's continuity theorem.

(iv) Zero-one laws: Borel-Cantelli's lemma, Kolmogorov/Borel's zero-one law.

(v) Characteristic functions: Definition, differentiability, Lévy's inversion theorem.

(vi) Inequalities: Chebyshev, Schwarz, Lyapounov, Jensen, the martingale maximal inequality.

Notes. Podcast (the start is a little slow as I had to wait for the screen to come down).

The final batch of weekly problems:

Problems 9.3 and 9.5: Notes. Podcast.

Problems 9.7-9.10 (from last year and in Norwegian): Notes 9.7-9.9 (you need to scroll). Notes 9.10. Video 9.7-9.9 (go to 52:40). Video 9.10.

Problems from trial exams and exams: Notes. Podcast.

Published Aug. 13, 2020 8:42 AM - Last modified July 11, 2023 1:57 PM