Top Ten Terms: Day 3 – Free Energy

Free energy is probably the most important principle for us to understand and assimilate. It is essential, indeed foundational, to both energy-based neural networks and to variational methods in machine learning.

Figure 1. Free energy is not really “free.” In a thermodynamic sense, however, it is the energy available to do work, such as driving a piston in an engine, or contributing to (or being absorbed by) a chemical reaction.

In the examples that we used in the video for Day 2: Thermodynamics, free energy is the energy that is available, after heat loss, to do work, such as heating a room, moving a piston, or spinning little angels around in a decoration.


Free Energy: The Basic Equation

Free energy can be defined in two ways. One equation states that the free energy is the enthalpy of a system (the total energy available) minus the temperature times the entropy; this temperature-entropy product represents the energy lost to friction and other non-work processes.
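In symbols, this is the standard Gibbs form (the G, H, T, and S notation here is ours, standing in for whatever symbols the figure uses):

    G = H - TS

where G is the free energy, H is the enthalpy, T is the absolute temperature, and S is the entropy.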

In AI applications, we divide through by the constants that carry units of energy and temperature, giving us a reduced free energy equation that has no units. We show this in Figure 2.

Figure 2. Free energy can be defined as the enthalpy (the total energy available in the system) minus temperature times the entropy (the energy lost to non-work activities, such as heat loss). The equation shown here is a reduced equation; all terms have been divided through by constants carrying units of energy and temperature. The resulting reduced equation has no units; it contains only scalar quantities.
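As a sketch of how that reduction works (assuming we divide through by N k_B T, where N is the number of units in the system and k_B is Boltzmann's constant; the figure may normalize slightly differently):

    G / (N k_B T) = H / (N k_B T) - S / (N k_B)

Each term on the right is now a pure number: the enthalpy is measured in units of k_B T, and the entropy in units of k_B, so the reduced free energy is dimensionless.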

YouTube: An Overview of Our Course Content

This YouTube video contains a bit about free energy, and also discusses other topics that we’ll address in this course. It’s less than six minutes long, and is mostly useful as an overview.

YouTube: Maren, Alianna J. 2021. “Statistical Mechanics: Foundational to Artificial Intelligence.” Themesis YouTube Channel.

Where Free Energy Fits in with Neural Networks and Machine Learning

The notion of free energy is fundamental to both energy-based neural networks and variational inference.

  • Variational methods use the notion of free energy, which is a macroscopic thermodynamic principle.
  • Energy-based neural networks (the Little-Hopfield neural network, the Boltzmann machine, many forms of deep learning, and also GANs, or generative adversarial networks) rest on statistical mechanics principles, in which free energy plays a central role. (A small numerical sketch follows this list.)
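To make this concrete, here is a minimal sketch (our illustration, not taken from the course materials) that computes two things for a toy three-unit binary model: the exact reduced free energy, F = -ln Z (with k_B T set to 1), and a variational free energy built from a factorized trial distribution q. The coupling weights and the q values are arbitrary, purely illustrative choices.

```python
import itertools
import math

# Toy energy-based model: three binary units with pairwise couplings.
# (Hypothetical weights, chosen purely for illustration.)
W = {(0, 1): 1.0, (0, 2): -0.5, (1, 2): 0.3}

def energy(s):
    """Energy of a state s in {0,1}^3: E(s) = -sum over pairs of W_ij * s_i * s_j."""
    return -sum(w * s[i] * s[j] for (i, j), w in W.items())

# Exact reduced (unitless) free energy: F = -ln Z, with k_B * T = 1.
states = list(itertools.product([0, 1], repeat=3))
Z = sum(math.exp(-energy(s)) for s in states)
F_exact = -math.log(Z)

# Variational free energy under a factorized q(s) = prod_i q_i(s_i):
#   F_var = E_q[E(s)] - H(q), and F_var >= F_exact always holds.
q = [0.6, 0.7, 0.4]  # q_i = probability that unit i is "on"; arbitrary values

def q_prob(s):
    """Probability of state s under the factorized trial distribution."""
    return math.prod(q[i] if s[i] == 1 else 1.0 - q[i] for i in range(3))

avg_energy = sum(q_prob(s) * energy(s) for s in states)            # E_q[E(s)]
entropy_q = -sum(q_prob(s) * math.log(q_prob(s)) for s in states)  # H(q)
F_var = avg_energy - entropy_q

print(f"exact F = {F_exact:.4f}   variational F = {F_var:.4f}")
```

The variational value always sits at or above the exact one; shrinking that gap by adjusting q is precisely the move that variational inference makes, and the same free energy bookkeeping underlies energy-based networks such as the Boltzmann machine.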

Exercise: Observe and Reflect

Review the video for today (from the Themesis Thinkific Day 3 course content).

  • Copy down any equations that you see, and manually annotate each term (for example: what does S mean?).

We’ll go deeper into the notion of free energy as we go through this course. In Week 2, we’ll have a book chapter devoted to free energy.


Reading

For now, read the Précis for the book Statistical Mechanics, Neural Networks, and Artificial Intelligence. It is available in your Day 0 Resources:

Day 0: Course Overview and Getting Started

The Précis is very dense; it is fine not to understand everything. Think of it as an overview of material that we’ll cover over the next few weeks.

Figure 3. Machine learning (including variational inference), together with energy-based neural networks, involves a combination of statistical mechanics and Bayesian probabilities. Figure taken from the Précis for Statistical Mechanics, Neural Networks, and Artificial Intelligence.

You can jump ahead to tomorrow’s content on Day 4: Enthalpy.

You can also go back to yesterday’s content on Day 2: Thermodynamics.