To read the classic (and important) papers on energy-based neural networks, we need the vocabulary and essential concepts from:
- Statistical mechanics,
- Bayesian probability,
- The Kullback-Leibler divergence, and more.

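As a taste of the last item: the Kullback-Leibler divergence measures how much information is lost when one probability distribution is used to approximate another. A minimal sketch for discrete distributions (the function name and sample distributions here are illustrative, not from any particular paper):

```python
import numpy as np

def kl_divergence(p, q):
    """D(P || Q) for discrete distributions with matching support.

    Non-negative, and zero only when p == q element-wise.
    """
    p = np.asarray(p, dtype=float)
    q = np.asarray(q, dtype=float)
    return float(np.sum(p * np.log(p / q)))

# Two hypothetical three-outcome distributions
p = [0.5, 0.3, 0.2]
q = [0.4, 0.4, 0.2]
print(kl_divergence(p, q))  # small positive number; note D(P||Q) != D(Q||P)
```

Note that the divergence is not symmetric, which is one reason the direction of the approximation matters in energy-based models.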
In today’s associated YouTube video, we illustrate how these different terms – and their respective disciplines – blend together, using the Salakhutdinov and Hinton (2012) paper as a reference point.
We’re preparing a short course that will introduce the essential vocabulary – concentrating on the Top Ten Terms in Statistical Mechanics.