Markov Processes, Winter term 2022/23
Mondays 12.15-14.00, Großer Hörsaal, and Tuesdays 16.15-18.00, Kleiner Hörsaal, Wegelerstraße 10.
Graduate Seminar in the summer term: preliminary meeting on Monday 16.1., 14.30, in N0.007
Lecture course: Andreas Eberle
Exercises: Stefan Oberdörster
Tutorial classes:
- Wednesday 16-18, N0.008, Shi Qijin
- Thursday 10-12, 0.007, Luke Eilers
Exam: oral (possible exam times on Tuesday 14.2., Thursday 23.2., Tuesday 28.2., Tuesday 28.3.)
The course will cover a selection of the following topics:
- Markov chains in discrete time (Generator, martingales, recurrence and transience, Harris Theorem, ergodic averages, central limit theorem)
- Markov chains in continuous time and piecewise deterministic Markov processes (Construction, generator, forward and backward equations, interacting particle systems on finite graphs)
- General Markov processes (Semigroups and generators, Feller and L^2 approach, martingale problem, Brownian motion with different boundary behaviours, h-transform, diffusions, interacting particle systems on Z^d)
- Long time behaviour (ergodic theory, couplings and contractivity, mixing times, functional inequalities, phase transitions)
- Limits (Weak convergence, Donsker invariance principle, limits of martingale problems)
Optionally, it can be combined with the two-hour course V5F6 - Selected Topics in Applied Probability - Convergence of Markov processes and Markov Chain Monte Carlo methods, which studies related questions from a more applied perspective.
Prerequisites: Conditional expectations, martingales, Brownian motion. Most importantly, you should be familiar with the general definition of conditional expectations and conditional probability laws as summarized here. Some background on functional analysis can be helpful but is not assumed.
My lecture notes for the foundations course on Stochastic Processes are available here. There you will find all the necessary background material. Alternatively, you may consult the more compact book Probability Theory by Varadhan. Sections up to 5.5 have been covered in previous courses and will be assumed. Section 5.7 and Chapter 6 will be covered in this course.
Lecture Notes: Lecture notes from previous two-semester courses on Markov Processes are available here. Please let me know about any corrections (small or large)!
Further Material:
- Liggett: Continuous-time Markov processes
- Stroock: An introduction to Markov processes
- Pardoux: Markov processes and applications
- Ethier/Kurtz: Markov processes: Characterization and convergence
- Bass: Stochastic processes
- Bakry/Gentil/Ledoux: Analysis and geometry of Markov diffusion operators
- Meyn/Tweedie: Markov chains and stochastic stability
- Brémaud: Markov chains
- Levin/Peres/Wilmer: Markov chains and mixing times
- Varadhan: Probability Theory
- Hairer: Convergence of Markov processes (Lecture notes)
- Lindgren: Lectures on stationary stochastic processes (Lecture notes)
Problem Sheets
- Sheet 0 (to be discussed in the tutorials during the first week)
- Sheet 1 (hand in solutions by Monday 17.10., 12.15, in the post-box opposite the entrance of the math library)
- Sheet 2 (hand in solutions by Monday 24.10., 12.15)
- Sheet 3 (corrected version of Exercise 1a; hand in solutions by Monday 31.10., 12.15)
- Sheet 4 (due Monday 7.11., 12.15)
- Sheet 5 (due Monday 14.11., 12.15)
- Sheet 6 (due Monday 21.11., 12.15)
- Sheet 7 (due Monday 28.11., 12.15)
- Sheet 8 (due Monday 5.12., 12.15)
- Sheet 9 (due Monday 12.12., 12.15)
- Sheet 10 (due Monday 19.12., 12.15)
- Sheet 11 (due Monday 9.1., 12.15)
- Sheet 12 (due Monday 16.1., 12.15)
- Sheet 13 (due Monday 23.1., 12.15)
Remark on exercises with a programming part: These can be done in a programming language/system of your choice; just submit a printout of your code listings. If you do not have a preference, we recommend Python with Jupyter notebooks, for which examples and model solutions will be provided here.
Installation: We recommend the Anaconda distribution. It already contains all the packages we will need (at first, these are mainly NumPy for numerical computations and matplotlib for 2D plots), as well as many others. A brief introduction and installation help can be found here and here. Once you have installed and started Jupyter notebooks, you should work through and execute the following example notebook (ipynb format), which contains a short introduction to Python and concludes with a simulation of random walks.
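For orientation, a minimal sketch of such a random walk simulation with NumPy and matplotlib could look as follows. This is only an illustrative example (the step distribution, number of paths and plot labels are chosen here for demonstration), not the example notebook referred to above:

import numpy as np
import matplotlib.pyplot as plt

rng = np.random.default_rng(seed=0)

n_steps = 1000   # number of steps per walk
n_paths = 5      # number of independent walks

# i.i.d. +/-1 increments; cumulative sums give the random walk paths
steps = rng.choice([-1, 1], size=(n_paths, n_steps))
paths = np.cumsum(steps, axis=1)

# prepend the starting point S_0 = 0 to each path
paths = np.concatenate([np.zeros((n_paths, 1), dtype=int), paths], axis=1)

# plot all paths against the time axis 0, 1, ..., n_steps
for path in paths:
    plt.plot(range(n_steps + 1), path)
plt.xlabel("n")
plt.ylabel("S_n")
plt.title("Simple random walks")
plt.show()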
A good next resource is the SciPy lecture notes, which contain introductions to Python, NumPy and matplotlib. Further useful references:
- SciPy: Lecture notes, Getting started, Resources
- NumPy: Quickstart, Array creation, Comparison to MATLAB
- matplotlib: Gallery, Tutorials
- Python tutorial
November 2022 Andreas Eberle eberle@uni-bonn.de