Book – Statistical Mechanics, Neural Networks, and Artificial Intelligence

This page gives you access to the Précis as well as select chapters from the book-in-progress, Statistical Mechanics, Neural Networks, and Artificial Intelligence.

Chapter 9 shows how the equations underlying the Hopfield network and the (restricted) Boltzmann machine shape their respective architectures and, in turn, their capabilities.

The Précis provides the overview, and Chapter 9, The Hopfield Network and the Restricted Boltzmann Machine, works out the details.

These chapters are essential reading for the flagship Themesis short course, Top Ten Terms (that You Need to Know) in Statistical Mechanics for Artificial Intelligence. (We colloquially refer to this as the Top Ten Terms course.)


Précis

The Précis for Statistical Mechanics, Neural Networks, and Artificial Intelligence (the book) is a short overview of the book’s contents. It can be read independently of the book and serves as a standalone introduction.

Use this link to access the Précis.


Select Chapters

We are making select chapters (from the middle of the book) available now. These are the chapters that deal with statistical mechanics and energy-based neural networks.

The prior chapters provide introductions and general tutorials on neural networks. You do not need the prior chapters to dive into the chapters that we’ve selected here. Jump right in!

  • Chapter 9: The Hopfield Network and the Boltzmann Machine – a relatively easy-to-read discussion of the single equation (with two variants) that governs both the Hopfield network and the Boltzmann machine. The chapter presents the two variant equations (one from each network), eight figures, and plenty of explanatory text – the EASIEST possible discussion of how the respective equations connect to the architectures, and vice versa.
  • Chapter 10: Introduction to Statistical Mechanics: Microstates and the Partition Function – statistical mechanics relative to neural networks, as reader-friendly as statistical mechanics can possibly be; lots of figures, and a worked example of computing microstates (essential to understanding partition functions, entropy, and free energy).
  • Chapter 11: Free Energy – free energy relative to neural networks and variational inference, with careful attention to notation.
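To give a flavor of the equation that Chapter 9 discusses, here is a minimal sketch (not the book's notation or code): it assumes the standard Hopfield energy E = -½ sᵀW s for bipolar (+1/-1) units with symmetric, zero-diagonal weights.

```python
import numpy as np

def hopfield_energy(s, W):
    """Energy of bipolar state vector s under symmetric weight matrix W.

    Assumes W is symmetric with zeros on the diagonal (no self-connections).
    Lower energy corresponds to more stable (better-matched) states.
    """
    return -0.5 * s @ W @ s

# Toy 3-unit network (illustrative values, not from the book)
W = np.array([[ 0.0,  1.0, -1.0],
              [ 1.0,  0.0,  1.0],
              [-1.0,  1.0,  0.0]])
s = np.array([1, -1, 1])
print(hopfield_energy(s, W))  # → 3.0
```

The Boltzmann machine uses the same quadratic form of energy; it differs in that its units update stochastically, with the energy entering through a Boltzmann probability rather than a deterministic update rule.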
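The microstate computation that Chapter 10 centers on can be illustrated by brute-force enumeration: list every configuration of a small system, assign each an energy, and sum the Boltzmann factors to get the partition function Z. This is a generic toy sketch (two bipolar units, temperature set to 1), not an excerpt from the book.

```python
import itertools
import math

# Enumerate all 2^N microstates of N bipolar units
states = list(itertools.product([-1, 1], repeat=2))

# Toy energy: E = -s1*s2, so aligned states have lower energy
energies = [-(s1 * s2) for s1, s2 in states]

# Partition function: sum of Boltzmann factors exp(-E/T), with T = 1
Z = sum(math.exp(-E) for E in energies)

# Each microstate's probability is its Boltzmann factor divided by Z
probs = [math.exp(-E) / Z for E in energies]
```

With Z in hand, entropy and free energy follow directly, which is why the chapter treats microstate counting as the essential first step.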