Spring 2021: PHYS 601 Methods of Theoretical Physics II
Tue-Thu 12:00-1:15 pm; Room PHYS 223 (the first lecture is in person; the following lectures will be online via Zoom). The videos of the lectures are posted on YouTube here. If you have signed up for the course, or are planning to sign up, please send me an email so that I can invite you to the Slack channel.
Prerequisites: Basic familiarity with quantum mechanics and statistical physics. No previous knowledge of renormalization group is assumed.
Course description: In this course, we discuss two main applications of information theory (classical and quantum) in physics: Inference and Coarse-graining.
The course is divided into three parts:
Part I discusses some basic notions of information theory, probability and inference.
Part II discusses tensor networks and their applications in physics.
Part III discusses coarse-graining, scaling and renormalization group in physics.
The goal of the course is to highlight the information-theoretic origin of some tools and techniques used in statistical physics such as the Gibbs state, Monte Carlo simulations, coarse-graining, scaling and the renormalization group.
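As a small taste of this viewpoint (in the spirit of the Jaynes paper listed under the extra reading for Part I), here is a sketch of how the Gibbs state follows from inference alone. Maximizing the Shannon entropy $S[p] = -\sum_i p_i \ln p_i$ over distributions with fixed normalization $\sum_i p_i = 1$ and fixed average energy $\sum_i p_i E_i = U$, with Lagrange multipliers $\lambda$ and $\beta$, gives

$$ -\ln p_i - 1 - \lambda - \beta E_i = 0 \quad\Longrightarrow\quad p_i = \frac{e^{-\beta E_i}}{Z}, \qquad Z = \sum_j e^{-\beta E_j}, $$

i.e. the canonical (Gibbs) distribution, with $\beta$ fixed by the energy constraint.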
Part I: Classical Information, Probability and Inference
References: For part I of the course we will mostly follow David MacKay’s textbook “Information Theory, Inference, and Learning Algorithms”, which can be found here for free.
Other recommended resources for the first part of the course are these video lectures by MacKay.
Part II: Tensor Networks and Their Applications
References: For part II of the course we will mostly follow these lectures by Bridgeman and Chubb.
Other useful references:
https://tensornetwork.org/reviews_resources.html
The crash course at the Perimeter Institute: the videos are here
Part III: Scaling and Renormalization Group
References: For part III of the course we will mostly follow these lectures by John McGreevy.
Other useful resources:
Simon DeDeo’s lectures on renormalization, with a focus on applications outside of physics.
Nigel Goldenfeld’s book “Lectures on Phase Transitions and the Renormalization Group”
John Cardy’s book “Scaling and Renormalization in Statistical Physics”
David Tong’s lectures on “Statistical Field Theory”
We will use Slack to post course announcements and reports, discuss ideas, and share interesting papers we come across. If you are planning to attend the class, please send me an email and I will add you to the course channel on Slack. All lectures will be recorded and posted on YouTube.
Homework:
Students should read the relevant section of the lecture notes before coming to class and submit two questions on Slack. Every lecture has homework problems that are assigned in class and must be submitted on Slack within a week of the lecture.
Syllabus:
Click on a lecture to download the corresponding lecture notes. The syllabus is subject to change as the course progresses.
Part I- Classical Information, Probability and Inference
- Lecture 0: Information is Physical
- Lecture 1: Introduction to Information Theory (reading: Section 1 of MacKay’s book)
- Lecture 2: Probability, Entropy and Inference (reading: Section 2 of MacKay’s book)
- Lecture 3: More about Inference (reading: Section 3 of MacKay’s book)
- Lecture 4: An Example Inference Task: Clustering (reading: Section 20 of MacKay’s book)
- Lecture 5: Exact Inference, Maximum Likelihood and Clustering (reading: Sections 21 and 22 of MacKay’s book)
- Lecture 6: Exact Marginalization (reading: Sections 23 and 24 of MacKay’s book)
- Lecture 7: Laplace’s Method, Model Comparison and Occam’s Razor (reading: Sections 27 and 28 of MacKay’s book)
- Lecture 8: Monte Carlo Methods I (reading: Section 29 up to the end of 29.3 of MacKay’s book)
- Lecture 9: Monte Carlo Methods II (reading: Sections 29.4 to 29.6 of MacKay’s book)
- Lecture 10: Efficient Monte Carlo Methods (reading: Section 30 of MacKay’s book)
- Lecture 11: Ising Models (reading: Section 31 of MacKay’s book; a minimal Metropolis sketch for the 2D Ising model follows this list)
- Lecture 12: Variational Methods I (reading: Section 33 of MacKay’s book)
- Lecture 13: Variational Methods II
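The following is not part of the assigned readings, just a minimal sketch of the Metropolis algorithm for the 2D Ising model discussed in Lectures 8-11; the lattice size, inverse temperature, and number of sweeps are arbitrary illustrative choices.

```python
import numpy as np

def metropolis_ising(L=16, beta=0.44, sweeps=1000, rng=None):
    """Metropolis sampling of the 2D Ising model (J = 1, no field)
    on an L x L lattice with periodic boundary conditions."""
    rng = np.random.default_rng() if rng is None else rng
    spins = rng.choice([-1, 1], size=(L, L))
    for _ in range(sweeps):
        for _ in range(L * L):  # one sweep = L*L single-spin update attempts
            i, j = rng.integers(L, size=2)
            # Sum of the four nearest neighbours (periodic boundaries).
            nn = (spins[(i + 1) % L, j] + spins[(i - 1) % L, j]
                  + spins[i, (j + 1) % L] + spins[i, (j - 1) % L])
            dE = 2.0 * spins[i, j] * nn  # energy cost of flipping spin (i, j)
            if dE <= 0 or rng.random() < np.exp(-beta * dE):
                spins[i, j] *= -1
    return spins

if __name__ == "__main__":
    config = metropolis_ising()
    print("mean magnetization per spin:", config.mean())
```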
Solutions to Assignments
Homework 2
Homework 3
Homework 4, 5, 6
Homework 7, 8, 9, 10
Extra reading:
- Information Theory and Statistical Mechanics (reading: Jaynes, E. T., “Information Theory and Statistical Mechanics,” Physical Review 106, 620 (1957))
Part II- Tensor Networks: Variational Methods in Quantum Mechanics
- Lecture 14: Introduction to Tensor Network Notation (reading: Sections 0 and 1 of Lecture Notes)
- Lecture 15: Introduction to Tensor Network Notation II (reading: Section 2 of Lecture Notes and Section 4.1 of Preskill’s lectures)
- Lecture 16: Matrix Product States (reading: Section 3 of Lecture Notes; see the short MPS sketch after this list)
- Lecture 17: Quantum Phases (reading: Section 4 of Lecture Notes)
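Not from the Lecture Notes, just an illustrative sketch of the construction behind Lecture 16: factoring an n-qubit state vector into matrix product state tensors by successive SVDs. The function name and the bond-dimension cutoff chi are illustrative choices, not notation from the notes.

```python
import numpy as np

def state_to_mps(psi, n, chi=16):
    """Factor an n-qubit state vector into MPS tensors of shape
    (left_bond, 2, right_bond) by sweeping SVDs from left to right."""
    tensors = []
    rest = psi.reshape(1, -1)              # (left bond, remaining physical dims)
    for _ in range(n - 1):
        left = rest.shape[0]
        rest = rest.reshape(left * 2, -1)  # split off one physical leg
        u, s, vh = np.linalg.svd(rest, full_matrices=False)
        keep = min(chi, len(s))            # truncate the bond dimension
        u, s, vh = u[:, :keep], s[:keep], vh[:keep, :]
        tensors.append(u.reshape(left, 2, keep))
        rest = s[:, None] * vh             # push the singular values to the right
    tensors.append(rest.reshape(rest.shape[0], 2, 1))
    return tensors

if __name__ == "__main__":
    n = 6
    psi = np.random.default_rng(0).normal(size=2**n)
    psi /= np.linalg.norm(psi)
    print([t.shape for t in state_to_mps(psi, n)])
```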
Extra readings:
- Tensor Network Algorithms (reading: Section 5 of Lecture Notes)
- PEPS and MERA (reading: Sections 6 and 7 of Lecture Notes)
- Introduction to Machine Learning and selected uses of Tensor Networks in Machine Learning (watch the slides).
Part III- Scaling and Renormalization
- Lecture 18: Scaling and Self-similarity (reading: Sections 0 and 1 of Lecture Notes)
- Lecture 19: Random Walks (reading: Section 2 of Lecture Notes)
- Lecture 20: Random Walks II
- Lecture 21: Ising Model I (reading: Section 3 of Lecture Notes; a small decimation RG sketch for the 1D Ising chain follows this list)
- Lecture 22: Ising Model II
- Lecture 23: Mean Field Theory I (reading: Section 4 of Lecture Notes)
- Lecture 24: Mean Field Theory II
- Lecture 25: Festival of Rigor (reading: Section 5 of Lecture Notes)
- Lecture 26: Field Theory I (reading: Section 6 of Lecture Notes)
- Lecture 27: Field Theory II
- Lecture 28: Field Theory III
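Not part of the lecture notes, but a tiny numerical illustration of the real-space RG idea from the Ising lectures: summing out every other spin of the zero-field 1D Ising chain gives the exact recursion K' = (1/2) ln cosh(2K), and iterating it shows any finite coupling flowing to the trivial fixed point K* = 0 (the starting coupling below is an arbitrary choice).

```python
import numpy as np

def decimate(K):
    """Exact decimation step for the zero-field 1D Ising chain:
    tracing out every other spin maps the coupling K -> (1/2) ln cosh(2K)."""
    return 0.5 * np.log(np.cosh(2.0 * K))

if __name__ == "__main__":
    K = 1.5  # arbitrary starting coupling J / (k_B T)
    for step in range(8):
        print(f"step {step}: K = {K:.6f}")
        K = decimate(K)
    # Any finite K flows to K* = 0, reflecting the absence of a
    # finite-temperature phase transition in one dimension.
```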
Extra reading:
- Scaling and Operator Product Expansion (reading: Sections 7 and 8 of Lecture Notes)
- Lower Dimensions and Continuous Symmetry (reading: Section 9 of Lecture Notes)
- RG approach to walking (reading: Section 10 of Lecture Notes)
- RG Sampler Platter (reading: Section 11 of Lecture Notes)
If you have any questions, comments, or suggestions, please do not hesitate to send me a message.
Term projects:
1) Quantum analog of the Bayes Rule
2) Topological phases of matter
3) Renormalization Group approach to Period Doubling