New Neural Network Class: Framework: The Future of AI (Part 2 of 3)
We want to identify how and where the next “big breakthrough” in AI will occur, and we use three tools, or approaches, to do so. Maren, Alianna J. 2023. “A New Neural Network Class: Creating the… Continue reading New Neural Network Class: Framework: The Future of AI (Part 2 of 3)
Kuhnian Normal and Breakthrough Moments: The Future of AI (Part 1 of 3)
Over the past fifty years, there have only been a few Kuhnian “paradigm shift” moments in neural networks. We’re ready for something new!
The Kullback-Leibler Divergence, Free Energy, and All Things Variational – Part 2 of 3
Free energy is the universal solvent of AI (artificial intelligence). It is the single underlying rule or principle that makes AI possible. Actually, that’s a simplification. There are THREE key things that underlie AI – whether we’re talking deep learning or variational methods. These are: Free energy – which we’ll discuss in this post, Latent… Continue reading The Kullback-Leibler Divergence, Free Energy, and All Things Variational – Part 2 of 3
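As a sketch of the connection this post teases — in standard variational-inference notation, where q(z) is an approximate posterior and p(x, z) a generative model (these symbols are the conventional ones, not taken from the post itself):

```latex
% Variational free energy in standard notation:
F(q) \;=\; \mathbb{E}_{q(z)}\!\left[\log q(z) - \log p(x,z)\right]
     \;=\; D_{\mathrm{KL}}\!\left(q(z)\,\|\,p(z\mid x)\right) \;-\; \log p(x).
```

Since log p(x) is fixed for given data, minimizing the free energy F(q) is the same as minimizing the Kullback-Leibler divergence — which is why free energy and the KL divergence keep appearing together in all things variational.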
Entropy in Energy-Based Neural Networks – Seven Key Papers (Part 3 of 3)
If you’ve looked at some classic papers in energy-based neural networks (e.g., the Hopfield neural network, the Boltzmann machine, the restricted Boltzmann machine, and all forms of deep learning), you’ll see that they don’t use the word “entropy.” At the same time, we’ve stated that entropy is a fundamental concept in these energy-based neural networks.… Continue reading Entropy in Energy-Based Neural Networks – Seven Key Papers (Part 3 of 3)
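A hedged sketch of where entropy hides even when the word never appears: for a Boltzmann distribution over states x, in standard statistical-mechanics notation (temperature T, partition function Z — conventional symbols, not drawn from the papers themselves):

```latex
% Boltzmann distribution and its free energy:
p(x) = \frac{e^{-E(x)/T}}{Z}, \qquad Z = \sum_x e^{-E(x)/T},
\qquad F = -T \log Z = \langle E \rangle - T S .
```

Any paper that works with −log Z (or a bound on it) is implicitly trading off expected energy ⟨E⟩ against entropy S, whether or not it ever names the entropy term.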
Latent Variables Enabled Effective Energy-Based Neural Networks: Seven Key Papers (Part 2 of 3)
Latent variables enabled effective energy-based neural networks. The key problem with the Little/Hopfield neural network was its limited memory capacity. This problem was resolved when Hinton, Ackley, and Sejnowski introduced the notion of latent variables, creating the Boltzmann machine. Seven key papers define the evolution of energy-based neural networks. Previously, we examined the first two… Continue reading Latent Variables Enabled Effective Energy-Based Neural Networks: Seven Key Papers (Part 2 of 3)
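A minimal NumPy sketch of the Little/Hopfield setup the excerpt refers to. The pattern count and network size here are illustrative assumptions, and the oft-cited capacity limit of roughly 0.14·N stored patterns comes from the standard Hopfield analysis, not from this post:

```python
import numpy as np

def train_hopfield(patterns):
    """Hebbian outer-product rule; self-connections zeroed."""
    n_units = patterns.shape[1]
    W = patterns.T @ patterns / n_units
    np.fill_diagonal(W, 0.0)
    return W

def recall(W, state, steps=10):
    """Synchronous sign updates; ties broken toward +1."""
    for _ in range(steps):
        state = np.where(W @ state >= 0, 1, -1)
    return state

rng = np.random.default_rng(0)
# 3 random +/-1 patterns over 50 units -- well under the ~0.14*N capacity limit.
patterns = rng.choice([-1, 1], size=(3, 50))
W = train_hopfield(patterns)

noisy = patterns[0].copy()
noisy[:5] *= -1          # corrupt 5 of the 50 bits
restored = recall(W, noisy)
```

With only three patterns in fifty units, the Hebbian rule typically restores the corrupted pattern; pushing the pattern count toward 0.14·N is where recall breaks down — the limited-capacity problem that the Boltzmann machine’s latent variables later addressed.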
Seven Key Papers for Energy-Based Neural Networks and Deep Learning (Part 1 of 3)
Seven key papers provide us with the evolutionary timeline for energy-based neural networks, up through and including deep learning. The timeline for these papers begins with William Little’s 1974 work on the first energy-based neural network, and then John Hopfield’s 1982 expansion on Little’s concepts, up through deep learning architectures as described by Hinton and… Continue reading Seven Key Papers for Energy-Based Neural Networks and Deep Learning (Part 1 of 3)