Markov Processes, Winter term 2020/21
Tuesdays 12.15-14.00 and Thursdays 12.15-14.00, online via Zoom. Please register on eCampus for this course.
Lecture course: Andreas Eberle
Exercises: Francesco De Vecchi
Tutorial classes: Group 1 Mondays 10-12, Zeichensaal. Group 2 Mondays 16-18, online via Zoom.
Exam: oral
The course will cover a selection of the following topics:
- Markov chains in discrete time (Generator, martingales, recurrence and transience, Harris Theorem, ergodic averages, central limit theorem)
- Markov chains in continuous time and piecewise deterministic Markov processes (Construction, generator, forward and backward equations, interacting particle systems on finite graphs)
- General Markov processes (Semigroups and generators, Feller and L2 approach, martingale problem, Brownian motion with different boundary behaviours, h transform, diffusions, interacting particle systems on Zd)
- Long time behaviour (ergodic theory, couplings and contractivity, mixing times, functional inequalities, phase transitions)
- Limits (Weak convergence, Donsker invariance principle, limits of martingale problems)
Prerequisites: Conditional expectations, martingales, Brownian motion. Most importantly, you should be familiar with the general definition of conditional expectations and conditional probability laws as summarized here, see also Problem Sheet 0.
My lecture notes of the foundations course on Stochastic Processes are available here. There you find all the necessary background material. Alternatively, you may consult the more compact book Probability Theory by Varadhan. Sections up to 5.5 have been covered in previous courses and will be assumed. Section 5.7 and Chapter 6 will be covered in this course.
Lecture Notes: The most recent version of the lecture notes is available here. Please let me know any corrections (small or large) !
Further Material:
- Liggett: Continuous-time Markov processes
- Stroock: An introduction to Markov processes
- Pardoux: Markov processes and applications
- Ethier/Kurtz: Markov processes: Characterization and convergence
- Bass: Stochastic processes
- Bakry/Gentil/Ledoux: Analysis and geometry of Markov diffusion operators
- Meyn/Tweedie: Markov chains and stochastic stability
- Brémaud: Markov chains
- Levin/Peres/Wilmer: Markov chains and mixing times
- Varadhan: Probability Theory
- Hairer: Convergence of Markov processes (Lecture notes)
- Lindgren: Lectures on stationary stochastic processes (Lecture Notes)
Problem Sheets can be found under "Material for the course" on eCampus.
Remark on exercises with a programming part: These can be done in a programming language/system of your choice; simply submit a printout of your code listings. If you do not have another preference, we recommend Python with Jupyter notebooks, for which examples and model solutions will be provided here.
Installation: We recommend the distribution Anaconda. It already contains all packages that we will need (at first, these are mainly NumPy for numerical computations and matplotlib for 2D plots), as well as many others. A brief introduction and help with installation can be found here and here. Once you have installed and started Jupyter notebooks, you should go through and execute the following example notebook (ipynb format), which contains a short introduction to Python that concludes with a simulation of random walks:
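As a taste of what the example notebook covers, the following minimal sketch (not part of the official course materials; function and parameter names are our own choice) simulates several independent simple random walks with NumPy:

```python
import numpy as np

def random_walks(n_steps, n_walks, seed=0):
    """Simulate n_walks independent simple random walks of n_steps steps each.

    Returns an array of shape (n_walks, n_steps) whose rows are the walk paths.
    """
    rng = np.random.default_rng(seed)
    # Each step is +1 or -1 with probability 1/2
    steps = rng.choice([-1, 1], size=(n_walks, n_steps))
    # Cumulative sums along each row give the walk positions
    return np.cumsum(steps, axis=1)

paths = random_walks(n_steps=1000, n_walks=500)
print(paths.shape)  # (500, 1000)
```

The paths can then be plotted with matplotlib, e.g. `plt.plot(paths[:10].T)` for the first ten walks; the sample variance of the endpoints should be close to the number of steps, as predicted for the simple random walk.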
A good next reference is the SciPy Lecture Notes, which contain introductions to Python, NumPy, and matplotlib. Further useful references:
- SciPy: Lecture notes, Getting started, Resources
- NumPy: Quickstart, Array creation, Comparison to MATLAB
- matplotlib: Gallery, Tutorials
- Python tutorial
November 2020 Andreas Eberle eberle@uni-bonn.de