Entropy in Energy-Based Neural Networks – Seven Key Papers (Part 3 of 3)

If you’ve looked at some classic papers in energy-based neural networks (e.g., the Hopfield neural network, the Boltzmann machine, the restricted Boltzmann machine, and all forms of deep learning), you’ll see that they don’t use the word “entropy.” At the same time, we’ve stated that entropy is a fundamental concept in these energy-based neural networks. … Continue reading Entropy in Energy-Based Neural Networks – Seven Key Papers (Part 3 of 3)
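
A quick reference point for that claim (this is the standard Gibbs entropy from statistical mechanics, not a formula quoted from the seven papers themselves): for a network whose states i occur with probability p_i, the entropy is

    S = -k_B \sum_i p_i \ln p_i

This S sits inside the free energy that these models do work with, which is how entropy can be fundamental even when the word never appears.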

Understanding Entropy: Essential to Mastering Advanced AI

Have you been stumped when trying to read the classic AI papers? Are notions such as free energy and entropy confusing? This is probably because ALL areas of advanced AI are based, to some extent, on statistical mechanics. That means that you need to understand some “stat mech” rudiments to get through those papers. One… Continue reading Understanding Entropy: Essential to Mastering Advanced AI
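
As a pocket definition to carry into those papers (again, textbook stat mech rather than anything from one specific paper): the Helmholtz free energy ties energy and entropy together as

    F = U - TS = -k_B T \ln Z, \qquad Z = \sum_i e^{-E_i / k_B T}

so minimizing F means trading off low energy U against high entropy S.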

The Great Career-Boosting Reading List: Book #5: “Wild Mercy”

Taking your “sabbaths” – on a regular basis – might be the most important thing that you can do for your overall career success. This week’s YouTube video discusses sabbaths, and shows how many very successful authors point out that “sabbath” – or resting from work – is essential to our productivity and to… Continue reading The Great Career-Boosting Reading List: Book #5: “Wild Mercy”

The Great Career-Boosting Reading List: Books 4(a) & 4(b)

Are you making a career transition? This is the situation that many of us are in these days – especially those who are graduating this week from Northwestern University’s School of Professional Studies (SPS) Master of Science in Data Science (MSDS) program. As we move into our career transitions, it is SO HELPFUL to… Continue reading The Great Career-Boosting Reading List: Books 4(a) & 4(b)

Mai Tai Thursdays, Coming Soon!

But first, a decent coffee … Hello, darling – I hope you’re well. The Backyard Roosters of Hawai’i: As I write to you, it’s 6:10 AM on the Big Island of Hawai’i, it’s still dark out, and the roosters are in full crow. MULTIPLE roosters. All asserting their very strong, dominant “rooster-ness.” It’s been going on for… Continue reading Mai Tai Thursdays, Coming Soon!

How Backpropagation and (Restricted) Boltzmann Machine Learning Combine in Deep Architectures

One of the most important things for us to understand, as we come into the “deep learning” aspect of AI (for the first time), is the relationship between backpropagation and (restricted) Boltzmann machines, which together form the essential core of various “deep learning” architectures. The essential idea in deep architectures is this: Each… Continue reading How Backpropagation and (Restricted) Boltzmann Machine Learning Combine in Deep Architectures
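
To make the division of labor concrete, here is a minimal sketch (my own illustration in NumPy, with made-up layer sizes; the post’s full treatment continues at the link): a single contrastive-divergence (CD-1) update for a restricted Boltzmann machine, the unsupervised step that pretrains a layer’s weights before backpropagation fine-tunes the stack.

    # Minimal sketch (illustration only, not code from the post):
    # one CD-1 update for a restricted Boltzmann machine.
    import numpy as np

    rng = np.random.default_rng(0)

    def sigmoid(x):
        return 1.0 / (1.0 + np.exp(-x))

    def cd1_update(W, b_v, b_h, v0, lr=0.1):
        """One CD-1 step on a batch of binary visible vectors v0 (batch x n_vis)."""
        p_h0 = sigmoid(v0 @ W + b_h)                # positive phase: hidden probs given data
        h0 = (rng.random(p_h0.shape) < p_h0) * 1.0  # sample hidden states
        p_v1 = sigmoid(h0 @ W.T + b_v)              # negative phase: one reconstruction
        p_h1 = sigmoid(p_v1 @ W + b_h)
        # Gradient estimate: data correlations minus reconstruction correlations.
        W += lr * (v0.T @ p_h0 - p_v1.T @ p_h1) / v0.shape[0]
        b_v += lr * (v0 - p_v1).mean(axis=0)
        b_h += lr * (p_h0 - p_h1).mean(axis=0)

    # Toy usage with made-up sizes: 6 visible units, 3 hidden units.
    n_vis, n_hid = 6, 3
    W = 0.01 * rng.standard_normal((n_vis, n_hid))
    b_v, b_h = np.zeros(n_vis), np.zeros(n_hid)
    data = (rng.random((20, n_vis)) < 0.5) * 1.0
    for _ in range(100):
        cd1_update(W, b_v, b_h, data)

In the standard deep-architecture recipe, each trained RBM’s hidden activations become the training data for the next RBM up, and the stacked weights then initialize a feed-forward network that backpropagation fine-tunes end to end.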

What It’s Like – Getting Very Close to the Edge (New Scientific Breakthrough) Part 1 (of Several)

Well, the best thing that I’ve got going for me right now is that the offerings on Netflix are so d*** boring that doing research is MUCH more interesting. I’ve read and re-read all my light, fun reading around the house … about three to four times each, over these past two years. My fave… Continue reading What It’s Like – Getting Very Close to the Edge (New Scientific Breakthrough) Part 1 (of Several)

Merry Christmas from Hawai’i!

Poinsettias. They’re the ultimate Christmassy image of Hawai’i, if you live on the mainland. And back when I lived on the mainland, poinsettias were something that I got at the grocery store or hardware store, starting around Thanksgiving. It wasn’t until I’d moved here … it was that first Christmas season, and I was driving down… Continue reading Merry Christmas from Hawai’i!

Latent Variables Enabled Effective Energy-Based Neural Networks: Seven Key Papers (Part 2 of 3)

Latent variables enabled effective energy-based neural networks. The key problem with the Little/Hopfield neural network was its limited memory capacity. This problem was resolved when Ackley, Hinton, and Sejnowski introduced the notion of latent variables, creating the Boltzmann machine. Seven key papers define the evolution of energy-based neural networks. Previously, we examined the first two… Continue reading Latent Variables Enabled Effective Energy-Based Neural Networks: Seven Key Papers (Part 2 of 3)
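
For orientation (the textbook form, not a quotation from the paper): with visible units v and latent (hidden) units h, the restricted Boltzmann machine’s energy is

    E(v, h) = -\sum_i b_i v_i - \sum_j c_j h_j - \sum_{i,j} v_i W_{ij} h_j

(the full Boltzmann machine also allows couplings within each layer), and it is these hidden units that lift the memory-capacity ceiling of the visible-only Little/Hopfield network.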

Seven Key Papers for Energy-Based Neural Networks and Deep Learning (Part 1 of 3)

Seven key papers provide us with the evolutionary timeline for energy-based neural networks, up through and including deep learning. The timeline begins with William Little’s 1974 work on the first energy-based neural network, continues with John Hopfield’s 1982 expansion of Little’s concepts, and runs up through deep learning architectures as described by Hinton and… Continue reading Seven Key Papers for Energy-Based Neural Networks and Deep Learning (Part 1 of 3)
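
For readers who want the starting point in symbols (the standard form, not quoted from the papers): the Little/Hopfield network assigns each configuration of binary units s_i the energy

    E = -\tfrac{1}{2} \sum_{i \neq j} w_{ij} s_i s_j - \sum_i \theta_i s_i

and memory retrieval is a descent to a local minimum of E; the later papers in this series build on exactly this energy picture.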
