Statistical Mechanics: Algorithms and Computations
Description
When you enroll for courses through Coursera you can choose between a paid plan and a free plan.
- Free plan: No certification, audit only. You will have access to all course materials except graded items.
- Paid plan: Commit to earning a Certificate—it's a trusted, shareable way to showcase your new skills.
About this course: In this course you will learn a whole lot of modern physics (classical and quantum) from basic computer programs that you will download, generalize, or write from scratch, discuss, and then hand in. Join in if you are curious (but not necessarily knowledgeable) about algorithms, and about the deep insights into science that you can obtain by the algorithmic approach.
Created by: École normale supérieure
Taught by: Werner Krauth, Directeur de recherches au CNRS
Department of Physics

Each course is like an interactive textbook, featuring pre-recorded videos, quizzes and projects.
Help from your peers: Connect with thousands of other learners and debate ideas, discuss course material, and get help mastering concepts.
École normale supérieure
The École normale supérieure (ENS) is a higher-education institution for predoctoral and doctoral studies (graduate school) and a leading center of French research. Each year, the ENS offers 300 new students and 200 doctoral candidates a high-level, broadly multidisciplinary education, ranging from the humanities and social sciences to the hard sciences. Regularly distinguished at the international level, the ENS has trained 10 Fields Medalists and 13 Nobel Prize winners.
Syllabus
WEEK 1
Monte Carlo algorithms (Direct sampling, Markov-chain sampling)
Dear students, welcome to the first week of Statistical Mechanics: Algorithms and Computations!
Here are a few details about the structure of the course: for each week, lecture and tutorial videos will be presented, together with a downloadable copy of all the relevant Python programs mentioned in the videos. Some in-video questions and practice quizzes will help you review the material, with no effect on the final grade. A mandatory peer-graded assignment is also included for Weeks 1 to 9; it expands on the lecture topics and lets you reach a deeper understanding. The nine peer-graded assignments make up 50% of the grade, while the other half comes from a final exam after the last lecture.
In this first week, we will learn about algorithms by playing with a pebble on the Monte Carlo beach and at the Monaco heliport. In the tutorial we will use the 3x3 pebble game to understand the essential concepts of Monte Carlo techniques (detailed balance, irreducibility, and aperiodicity), and meet the celebrated Metropolis algorithm. Finally, the homework session will let you understand some useful aspects of Markov-chain Monte Carlo, related to convergence and error estimation.
3 videos, 2 readings, 1 practice quiz
- Video: Lecture 1: Introduction to Monte Carlo algorithms
- Reading: Python programs and references
- Reading: Errata (Lecture 1)
- Video: Tutorial 1: Exponential convergence and the 3x3 pebble game
- Video: Homework Session 1: From the one-half rule to the bunching method
- Practice Quiz: Practice quiz 1: spotting a correct algorithm
Graded: From the one-half rule to the bunching method
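To make Week 1's contrast between direct sampling and Markov-chain sampling concrete, here is a minimal sketch of the two pebble games, written in the spirit of the course's downloadable Python programs (function names, the step size delta, and the starting point are illustrative assumptions, not the course's exact files):

```python
import random

# Direct sampling: throw pebbles uniformly into the square [-1, 1]^2,
# count those landing inside the unit circle.
def direct_pi(n_trials):
    n_hits = 0
    for _ in range(n_trials):
        x, y = random.uniform(-1.0, 1.0), random.uniform(-1.0, 1.0)
        if x ** 2 + y ** 2 < 1.0:
            n_hits += 1
    return 4.0 * n_hits / n_trials

# Markov-chain sampling: move the pebble by a small random step,
# rejecting moves that leave the square (the pebble then stays put).
def markov_pi(n_trials, delta=0.1):
    x, y = 1.0, 1.0            # start in a corner of the square
    n_hits = 0
    for _ in range(n_trials):
        dx = random.uniform(-delta, delta)
        dy = random.uniform(-delta, delta)
        if abs(x + dx) < 1.0 and abs(y + dy) < 1.0:
            x, y = x + dx, y + dy
        if x ** 2 + y ** 2 < 1.0:
            n_hits += 1
    return 4.0 * n_hits / n_trials

print(direct_pi(100000), markov_pi(100000))
```

With a small step size, successive pebble positions are strongly correlated, which is precisely the convergence and error-estimation issue (bunching) studied in Homework Session 1.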
WEEK 2
Hard disks: From Classical Mechanics to Statistical Mechanics
In Week 2, you will get in touch with the hard-disk model, which was first simulated by Molecular Dynamics in the 1950s. We will describe the difference between direct sampling and Markov-chain sampling, and also study the connection between Monte Carlo and Molecular Dynamics algorithms, that is, the interface between Newtonian mechanics and statistical mechanics. The tutorial covers classical concepts from statistical physics (partition function, virial expansion, ...), and the homework session will show that the equiprobability principle might be more subtle than expected.
3 videos, 1 reading, 1 practice quiz
- Video: Lecture 2: Hard disks: from Classical Mechanics to Statistical Mechanics
- Reading: Python programs and references
- Video: Tutorial 2: Equiprobability, partition functions, and virial expansions for hard disks
- Video: Homework Session 2: Paradoxes of hard-disk simulations in a box
- Practice Quiz: Practice quiz 2: spotting a correct algorithm (continued)
Graded: Paradoxes of hard-disk simulations in a box
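As an illustration of the direct-sampling side of Week 2, here is a minimal "tabula rasa" sketch for placing non-overlapping hard disks in the unit square: whenever a newly placed disk overlaps an earlier one, the whole configuration is thrown away and rebuilt (names and parameters are illustrative; the course's own programs may differ in detail):

```python
import random

# Direct sampling of N non-overlapping disks of radius sigma in the
# unit square. Any overlap discards the whole configuration.
def direct_disks_box(N, sigma):
    while True:
        config = [(random.uniform(sigma, 1.0 - sigma),
                   random.uniform(sigma, 1.0 - sigma))]
        for _ in range(N - 1):
            x, y = (random.uniform(sigma, 1.0 - sigma),
                    random.uniform(sigma, 1.0 - sigma))
            if any((x - a) ** 2 + (y - b) ** 2 < 4.0 * sigma ** 2
                   for a, b in config):
                break              # overlap: start from scratch
            config.append((x, y))
        else:
            return config          # all N disks placed without overlap

print(direct_disks_box(4, 0.15))
```

The acceptance rate of this rejection scheme collapses as the density grows, which is one reason Markov-chain and Molecular Dynamics approaches are introduced in the lecture.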
WEEK 3
Entropic interactions and phase transitions
After the hard disks of Week 2, in Week 3 we switch to clothespins aligned on a washing line. This is a great model for learning about entropic interactions, which arise purely from statistical-mechanics considerations. In the tutorial you will see an example of a typical situation: having an exact solution often corresponds to finding a perfect algorithm to sample configurations. Finally, in the homework session we will go back to hard disks and see simple evidence of the transition between a liquid and a solid in a two-dimensional system.
3 videos, 2 readings, 1 practice quiz
- Video: Lecture 3: Entropic interactions, phase transitions
- Reading: Python programs and references
- Video: Tutorial 3: Algorithms, exact solutions, thermodynamic limit
- Reading: Errata (Tutorial 3)
- Video: Homework Session 3: Two-dimensional liquids and solids
- Practice Quiz: Practice quiz 3: Spotting a correct algorithm (continued)
Graded: Two-dimensional liquids and solids
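The tutorial's point that an exact solution often comes with a perfect sampling algorithm can be sketched for the clothespin model: hard rods on a line can be sampled directly, with no rejections, by first placing point particles on a shortened line and then re-inflating them (a minimal sketch, assuming pins of diameter 2*sigma with centers on a line of length L; names are illustrative):

```python
import random

# Rejection-free direct sampling of N clothespins (1D hard rods of
# diameter 2*sigma) on a washing line of length L.
def direct_pins(N, sigma, L):
    # point particles on a line of reduced length L - 2*N*sigma
    y = sorted(random.uniform(0.0, L - 2.0 * N * sigma) for _ in range(N))
    # shift each sorted point to restore the excluded volume
    return [y_k + (2 * k + 1) * sigma for k, y_k in enumerate(y)]

print(direct_pins(10, 0.1, 10.0))
```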
WEEK 4
Sampling and integration
In Week 4 we will deepen our understanding of sampling and its connection with integration, which will allow us to introduce another pillar of statistical mechanics (after the equiprobability principle): the Maxwell and Boltzmann distributions of velocities and energies. In the homework session, we will push the limits of sampling until we can compute the volume of a sphere... in 200 dimensions!
3 videos, 1 reading, 1 practice quiz
- Video: Lecture 4: Sampling and Integration - From Gaussians to the Maxwell and Boltzmann distributions
- Reading: Python programs and references
- Video: Tutorial 4: Sampling discrete and one-dimensional distributions
- Video: Homework Session 4: Sampling and integration in high dimensions
- Practice Quiz: Practice quiz 4: four disks in a box
Graded: Sampling and integration in high dimensions
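A minimal sketch of the Gaussian sampling trick behind Week 4: d independent Gaussians give a uniformly distributed direction, which lets one sample points uniformly inside a d-dimensional ball without rejections, while naive cube sampling quickly becomes hopeless as d grows (illustrative code, not the course's exact program):

```python
import random, math

# Sample a point uniformly inside the d-dimensional unit ball:
# d independent Gaussians give a uniform direction, and a radius
# r = U**(1/d) gives the correct radial distribution.
def direct_sphere(d):
    x = [random.gauss(0.0, 1.0) for _ in range(d)]
    norm = math.sqrt(sum(c ** 2 for c in x))
    r = random.uniform(0.0, 1.0) ** (1.0 / d)
    return [r * c / norm for c in x]

# Naive volume estimate in low dimension: fraction of cube points that
# fall inside the ball (exact value for d = 4: pi**2 / 2 ~ 4.93). In
# 200 dimensions this hit rate is essentially zero, which motivates the
# smarter sampling strategies of the homework.
d, n_trials, n_hits = 4, 100000, 0
for _ in range(n_trials):
    if sum(random.uniform(-1.0, 1.0) ** 2 for _ in range(d)) < 1.0:
        n_hits += 1
print(2.0 ** d * n_hits / n_trials, direct_sphere(200)[:3])
```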
WEEK 5
Density matrices and Path integrals (Quantum Statistical mechanics 1/3)
Week 5 is the first episode of a three-week journey through quantum statistical mechanics. We will start by learning about density matrices and path integrals, fascinating tools for studying quantum systems. In many cases, the Trotter approximation will be useful for treating non-trivial systems, and also for following the time evolution of a system. All these topics, including the matrix-squaring technique, will be reviewed in detail in the homework session, where you will also study the anharmonic potential.
Note that previous knowledge of quantum mechanics is not really necessary for the next three weeks. Follow us on our journey through algorithms and physics, and don't forget to ask on the forum if you have any doubts!
3 videos, 1 reading, 1 practice quiz
- Video: Lecture 5: Density matrices and path integrals
- Reading: Python programs and references
- Video: Tutorial 5: Trotter decomposition and quantum time-evolution
- Video: Homework session 5: Quantum statistical mechanics and Quantum Monte Carlo
- Practice Quiz: Practice quiz 5: Four disks in a box (continued)
Graded: Quantum statistical mechanics and Quantum Monte Carlo
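A minimal sketch of the Trotter approximation and matrix squaring mentioned in Week 5, for a particle in a harmonic potential on a position grid (the grid, the temperatures, and the variable names are illustrative assumptions; units with hbar = m = 1):

```python
import math
import numpy as np

# High-temperature Trotter approximation of the density matrix,
# followed by repeated matrix squaring rho(beta) -> rho(2*beta).
def rho_free(x, xp, beta):
    # free-particle density matrix
    return (math.exp(-(x - xp) ** 2 / (2.0 * beta)) /
            math.sqrt(2.0 * math.pi * beta))

def V(x):
    return 0.5 * x ** 2            # harmonic potential

grid = np.linspace(-5.0, 5.0, 201)
dx = grid[1] - grid[0]
beta = 2.0 ** (-6)                 # start at high temperature
rho = np.array([[rho_free(a, b, beta) * math.exp(-0.5 * beta * (V(a) + V(b)))
                 for b in grid] for a in grid])
while beta < 4.0:                  # double beta until beta = 4
    rho = dx * rho.dot(rho)        # rho(2b) = integral dx'' rho(x,x'') rho(x'',x')
    beta *= 2.0

Z = dx * np.trace(rho)             # partition function from the diagonal
print(Z, 1.0 / (2.0 * math.sinh(beta / 2.0)))  # exact harmonic-oscillator Z
```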
WEEK 6
Lévy Quantum Paths (Quantum Statistical mechanics 2/3)
In Week 6, the second quantum week, we will introduce the properties of bosons, indistinguishable particles with peculiar statistics. At the same time, we will also go further by learning a powerful sampling algorithm, the Lévy construction, and in the homework session you will thoroughly compare it with standard sampling techniques.
3 videos, 1 reading, 1 practice quiz
- Video: Lecture 6: Lévy sampling of quantum paths
- Reading: Python programs and references
- Video: Tutorial 6: Bosonic statistics (with wave functions)
- Video: Homework session 6: Path sampling: A firework of algorithms
- Practice Quiz: Practice quiz 6: Path integrals
Graded: Path sampling: A firework of algorithms
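The Lévy construction of Week 6 can be sketched for a free particle: each intermediate path slice is drawn from its exact Gaussian conditional distribution, so a path between fixed endpoints is sampled without any rejections (a minimal sketch; endpoint values, beta, and the number of slices are illustrative):

```python
import random

# Lévy construction of a free-particle path x[0..N] between fixed
# endpoints xstart and xend at inverse temperature beta.
def levy_free_path(xstart, xend, beta, N):
    dtau = beta / N
    x = [xstart]
    for k in range(1, N):
        dtau_prime = (N - k) * dtau
        # Gaussian conditional distribution of slice k, given slice k-1
        # and the fixed endpoint.
        x_mean = (dtau_prime * x[k - 1] + dtau * xend) / (dtau + dtau_prime)
        sigma = (1.0 / (1.0 / dtau + 1.0 / dtau_prime)) ** 0.5
        x.append(random.gauss(x_mean, sigma))
    x.append(xend)
    return x

print(levy_free_path(0.0, 1.0, beta=4.0, N=8))
```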
WEEK 7
Bose-Einstein condensation (Quantum Statistical mechanics 3/3)
At the end of our quantum journey, in Week 7, we discuss the Bose-Einstein condensation phenomenon, theoretically predicted in the 1920s and observed in the 1990s in experiments with ultracold atoms. In the path-integral framework, an elegant description of this phenomenon is in terms of permutation cycles, which also lead to a great sampling algorithm, to be discussed in the homework session.
3 videos, 1 reading, 1 practice quiz
- Video: Lecture 7: Quantum indiscernability and Bose-Einstein condensation
- Reading: Python programs and references
- Video: Tutorial 7: Permutation cycles and ideal Bosons
- Video: Homework session 7: Bosons in a trap - Bose-Einstein condensation
- Practice Quiz: Practice quiz 7: BEC
Graded: Bosons in a trap - Bose-Einstein condensation
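A minimal sketch of the permutation-cycle description behind Week 7: the canonical partition function of N ideal bosons can be built up from a cycle recursion, and the resulting cycle-length distribution is the quantity that signals condensation. The single-particle partition function below assumes an isotropic 3D harmonic trap with the ground-state energy set to zero; all names and parameters are illustrative:

```python
import math

# Single-particle partition function of a 3D harmonic trap, evaluated
# at k*beta (ground-state energy set to zero).
def z(beta, k):
    return 1.0 / (1.0 - math.exp(-k * beta)) ** 3

# Cycle recursion: Z_M = (1/M) * sum_{k=1..M} z(k*beta) * Z_{M-k}.
def canonic_recursion(N, beta):
    Z = [1.0]                      # Z_0 = 1
    for M in range(1, N + 1):
        Z.append(sum(z(beta, k) * Z[M - k] for k in range(1, M + 1)) / M)
    return Z

N, beta = 8, 1.0
Z = canonic_recursion(N, beta)
# Probability that a given particle sits on a permutation cycle of length k.
pi_k = [z(beta, k) * Z[N - k] / (N * Z[N]) for k in range(1, N + 1)]
print(pi_k, sum(pi_k))             # the probabilities sum to 1
```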
WEEK 8
Ising model - Enumerations and Monte Carlo algorithms
In Week 8 we come back to classical physics, and in particular to the Ising model, which captures the essential physics of a set of magnetic spins. This is also a fundamental model for the development of sampling algorithms, and we will see different approaches at work: a local algorithm, the very efficient cluster algorithms, and the heat-bath algorithm with its connection to coupling. All of these will be revisited in the homework session, where you will gain precise control over the transition between ordered and disordered states.
3 videos, 1 reading, 1 practice quiz
- Video: Lecture 8: Ising model - From enumeration to Cluster Monte Carlo Simulations
- Reading: Python programs and references
- Video: Tutorial 8: Ising model - Heat bath algorithm, coupling of Markov chains
- Video: Homework session 8: Cluster sampling, perfect sampling in the Ising model
- Practice Quiz: Practice quiz 8: Spins and Ising model
Graded: Cluster sampling, perfect sampling in the Ising model
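As a concrete instance of the "local algorithm" of Week 8, here is a minimal Metropolis sketch for the two-dimensional Ising model with periodic boundary conditions (lattice size, temperature, and run length are illustrative; the cluster and heat-bath algorithms of the lecture are not shown here):

```python
import random, math

# Local Metropolis algorithm for the 2D Ising model (J = 1) on an
# L x L square lattice with periodic boundary conditions.
def metropolis_ising(L, beta, n_steps):
    N = L * L
    # neighbor table: right, up, left, down
    nbr = {i: ((i // L) * L + (i + 1) % L, (i + L) % N,
               (i // L) * L + (i - 1) % L, (i - L) % N) for i in range(N)}
    S = [random.choice([1, -1]) for _ in range(N)]
    for _ in range(n_steps):
        k = random.randint(0, N - 1)
        delta_E = 2.0 * S[k] * sum(S[j] for j in nbr[k])
        if random.uniform(0.0, 1.0) < math.exp(-beta * delta_E):
            S[k] *= -1
    return S

S = metropolis_ising(16, beta=0.5, n_steps=100000)
print(abs(sum(S)) / float(len(S)))     # magnetization per spin
```

Near the ordering transition, single-spin moves decorrelate very slowly; this critical slowing-down is what the cluster algorithms of the lecture overcome.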
WEEK 9
Dynamic Monte Carlo, simulated annealing
Continuing with simple models for spins, in Week 9 we start by learning about a dynamic Monte Carlo algorithm which runs faster than the clock. This is easily devised for a single-spin system, and can also be generalized to the full Ising model from Week 8. In the tutorial we move towards the simulated-annealing technique, a physics-inspired optimization method with a very broad applicability. You will also revisit this in the homework session, and apply it to the sphere-packing and traveling-salesman problems.
3 videos, 1 reading
- Video: Lecture 9: Dynamical Monte Carlo and the Faster-than-the-Clock approach
- Reading: Python programs and references
- Video: Tutorial 9: Simulated Annealing and the 13-sphere problem
- Video: Homework session 9: Simulated Annealing for sphere packings and the travelling salesman problem
Graded: Simulated Annealing for sphere packings and the travelling salesman problem
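A minimal simulated-annealing sketch in the spirit of Week 9's homework, applied to a small traveling-salesman instance; the segment-reversal move, the geometric cooling schedule, and all parameters are illustrative assumptions rather than the course's own choices:

```python
import random, math

def tour_length(cities, tour):
    # total length of the closed tour
    return sum(math.dist(cities[tour[i]], cities[tour[(i + 1) % len(tour)]])
               for i in range(len(tour)))

# Simulated annealing: propose to reverse a random segment of the tour,
# accept with the Metropolis rule at a slowly decreasing temperature.
def anneal_tsp(cities, T=1.0, cooling=0.999, n_steps=100000):
    n = len(cities)
    tour = list(range(n))
    length = tour_length(cities, tour)
    for _ in range(n_steps):
        i, j = sorted(random.sample(range(n), 2))
        new_tour = tour[:i] + tour[i:j + 1][::-1] + tour[j + 1:]
        delta = tour_length(cities, new_tour) - length
        if delta < 0.0 or random.uniform(0.0, 1.0) < math.exp(-delta / T):
            tour, length = new_tour, length + delta
        T *= cooling
    return tour, length

cities = [(random.random(), random.random()) for _ in range(30)]
print(anneal_tsp(cities)[1])
```

The same accept/reject structure, with tours replaced by sphere positions and tour length by an overlap-based cost, carries over to the sphere-packing part of the assignment.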
WEEK 10
The Alpha and the Omega of Monte Carlo, Review, Party
The lecture of Week 10 includes the alpha and the omega of our course. First we repeat the experiment of Buffon's needle, already performed in the 18th century, and then we touch on the sophisticated theory of Lévy stable distributions and their connection with the central limit theorem. In the tutorial there will be time to review the entire course material, and then a little party is due to celebrate the end of the course!
(There is no homework session for Week 10, but don't forget that the final exam is still there!)
2 videos, 1 reading
- Video: Lecture 10: The Alpha and the Omega of Monte Carlo
- Reading: Python programs and references
- Video: Tutorial 10: Review - Party - Best of
Graded: Final Exam 2016
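Buffon's needle, the "alpha" of Week 10, in a minimal direct-sampling sketch (needle length and crack spacing both set to 1; note that sampling the angle with math.pi is itself a small cheat if the goal is to estimate pi):

```python
import random, math

# Buffon's needle: a needle of length 1 is dropped on a floor with
# parallel cracks a unit distance apart; the hit probability 2/pi
# yields an estimate of pi.
def buffon(n_trials):
    n_hits = 0
    for _ in range(n_trials):
        x_center = random.uniform(0.0, 0.5)       # distance to nearest crack
        phi = random.uniform(0.0, math.pi / 2.0)  # angle between needle and cracks
        if x_center < 0.5 * math.sin(phi):        # the needle crosses a crack
            n_hits += 1
    return 2.0 * n_trials / n_hits

print(buffon(1000000))
```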