Kullback-Leibler, Etc. – Part 2.5 of 3: Black Diamonds

We need a “black diamond” rating system to mark the tutorials, YouTube videos, and other resources that help us learn the AI fundamentals. Case in point: last week, I added a blog post by Damian Ejlli to the References list: “Three Statistical Physics Concepts and Methods Used in Machine Learning.” (You’ll see it again in…

The Kullback-Leibler Divergence, Free Energy, and All Things Variational – Part 2 of 3

Free energy is the universal solvent of AI (artificial intelligence). It is the single underlying rule or principle that makes AI possible. Actually, that’s a simplification. There are THREE key things that underlie AI – whether we’re talking deep learning or variational methods. These are: Free energy – which we’ll discuss in this post, Latent…
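To make the free energy idea concrete, here is a minimal sketch (my own illustration, not code from the post) of the variational free energy for a discrete latent variable: F(q) = E_q[log q(z)] - E_q[log p(x, z)]. Because F = D_KL(q || p(z|x)) - log p(x), minimizing F over q pulls q toward the true posterior. All the numbers below are made up for the example.

```python
import numpy as np

def variational_free_energy(q, log_joint):
    """Variational free energy F(q) = E_q[log q(z)] - E_q[log p(x, z)]
    for a discrete latent variable z. Since F = D_KL(q || p(z|x)) - log p(x),
    minimizing F over q drives q toward the true posterior p(z | x).

    q:         probabilities over latent states z (sums to 1)
    log_joint: log p(x, z) values, one per latent state
    """
    q = np.asarray(q, dtype=float)
    log_joint = np.asarray(log_joint, dtype=float)
    mask = q > 0  # zero-probability states contribute nothing
    return np.sum(q[mask] * (np.log(q[mask]) - log_joint[mask]))

# Made-up example: three latent states with joint probabilities p(x, z).
log_joint = np.log([0.2, 0.1, 0.1])    # so p(x) = 0.4
q_exact = np.array([0.5, 0.25, 0.25])  # the true posterior p(z | x)
q_rough = np.array([1/3, 1/3, 1/3])    # a cruder guess
print(variational_free_energy(q_exact, log_joint))  # = -log p(x) ~ 0.916
print(variational_free_energy(q_rough, log_joint))  # ~ 0.973, larger as expected
```

When q is the exact posterior, F collapses to -log p(x); any other q pays a K-L penalty on top of that. That gap is exactly what variational methods shrink.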

The Kullback-Leibler Divergence, Free Energy, and All Things Variational – Part 1.5 of 3

Let’s talk about the Kullback-Leibler divergence. (Sometimes, we call this the “K-L divergence.”) It’s the foundation, the building block, for variational methods. The Kullback-Leibler divergence is a made-up measure – a human invention, not one of those “fundamental laws of the universe.” Nevertheless, it’s become very useful – and is worth our…
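To see just how simple this “made-up measure” is, here is a minimal sketch (mine, not from the post) of the discrete K-L divergence, D_KL(P || Q) = sum_i P(i) log(P(i)/Q(i)), with made-up example distributions. Note that it is not symmetric: swapping P and Q gives a different number – one reason it’s called a “divergence” rather than a distance.

```python
import numpy as np

def kl_divergence(p, q):
    """Discrete Kullback-Leibler divergence D_KL(P || Q), in nats.

    p, q: arrays of probabilities over the same support, each summing to 1.
    Terms where p[i] == 0 contribute 0 by convention.
    """
    p = np.asarray(p, dtype=float)
    q = np.asarray(q, dtype=float)
    mask = p > 0
    return np.sum(p[mask] * np.log(p[mask] / q[mask]))

# Example: two made-up distributions over three outcomes.
p = [0.5, 0.3, 0.2]
q = [0.4, 0.4, 0.2]
print(kl_divergence(p, q))  # ~0.0253 nats
print(kl_divergence(q, p))  # ~0.0258 nats - not the same; K-L is asymmetric
```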

The Kullback-Leibler Divergence, Free Energy, and All Things Variational (Part 1 of 3)

Variational Methods: Where They Are in the AI/ML World

The bleeding edge of AI and machine learning (ML) deals with variational methods. Variational inference, in particular, is needed because we can’t envision every possible instance that would comprise a good training and testing data set. There will ALWAYS be some sort of oddball thing that…

Entropy in Energy-Based Neural Networks – Seven Key Papers (Part 3 of 3)

If you’ve looked at some classic papers in energy-based neural networks (e.g., the Hopfield neural network, the Boltzmann machine, the restricted Boltzmann machine, and all forms of deep learning), you’ll see that they don’t use the word “entropy.” At the same time, we’ve stated that entropy is a fundamental concept in these energy-based neural networks…
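As a hedged illustration of where the entropy hides (a sketch of my own, not from any of the seven papers): energy-based networks assign each state s a probability through the Boltzmann distribution, p(s) = exp(-E(s)/T) / Z, and the entropy of that distribution rides along implicitly in the partition-function bookkeeping. The energies below are made up.

```python
import numpy as np

def boltzmann_probs(energies, T=1.0):
    """Boltzmann distribution p(s) = exp(-E(s)/T) / Z over discrete states."""
    logits = -np.asarray(energies, dtype=float) / T
    logits -= logits.max()              # shift for numerical stability
    w = np.exp(logits)
    return w / w.sum()                  # dividing by Z normalizes

# Made-up energies for four network states: lower energy -> higher probability.
p = boltzmann_probs([1.0, 2.0, 2.0, 3.0], T=1.0)
entropy = -np.sum(p * np.log(p))        # the entropy the papers never name
print(p, entropy)
```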

Understanding Entropy: Essential to Mastering Advanced AI

Have you been stumped when trying to read the classic AI papers? Are notions such as free energy and entropy confusing? This is probably because ALL areas of advanced AI are based, to some extent, on statistical mechanics. That means that you need to understand some “stat mech” rudiments to get through those papers. One…
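For a first taste of those “stat mech” rudiments (a sketch of my own, not from the post): the Shannon entropy, S = -sum_i p_i log p_i, measures how spread out a probability distribution is, and it is the quantity that free energy trades off against energy (F = E - TS). The distributions below are made up.

```python
import numpy as np

def shannon_entropy(p):
    """Shannon entropy S = -sum_i p_i * log(p_i), in nats.
    Zero-probability states contribute nothing, by convention.
    """
    p = np.asarray(p, dtype=float)
    p = p[p > 0]
    return -np.sum(p * np.log(p))

# A uniform distribution maximizes entropy; a sharply peaked one lowers it.
print(shannon_entropy([0.25, 0.25, 0.25, 0.25]))  # ln(4) ~ 1.386
print(shannon_entropy([0.97, 0.01, 0.01, 0.01]))  # ~ 0.168
```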