Entropy in Energy-Based Neural Networks – Seven Key Papers (Part 3 of 3)
If you’ve looked at some classic papers in energy-based neural networks (e.g., the Hopfield neural network, the Boltzmann machine, the restricted Boltzmann machine, and all forms of deep learning), you’ll see that they don’t use the word “entropy.” At the same time, we’ve stated that entropy is a fundamental concept in these energy-based neural networks. … Continue reading Entropy in Energy-Based Neural Networks – Seven Key Papers (Part 3 of 3)
Understanding Entropy: Essential to Mastering Advanced AI
Have you been stumped when trying to read the classic AI papers? Are notions such as free energy and entropy confusing? This is probably because ALL areas of advanced AI are based, to some extent, on statistical mechanics. That means you need to understand some “stat mech” rudiments to get through those papers. One… Continue reading Understanding Entropy: Essential to Mastering Advanced AI