Understanding Entropy: Essential to Mastering Advanced AI

Have you been stumped when trying to read the classic AI papers? Are notions such as free energy and entropy confusing? This is probably because ALL areas of advanced AI are based, to some extent, on statistical mechanics. That means that you need to understand some “stat mech” rudiments to get through those papers. One… Continue reading Understanding Entropy: Essential to Mastering Advanced AI
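As a first taste of the "stat mech" rudiments mentioned above, here is how Shannon entropy of a discrete probability distribution can be computed. This is a minimal sketch for illustration; the function name and example distributions are mine, not drawn from any particular paper.

```python
import math

def entropy(probs, base=2):
    """Shannon entropy H(p) = -sum(p_i * log(p_i)) of a discrete
    distribution (assumed already normalized); zero-probability
    outcomes contribute nothing."""
    return -sum(p * math.log(p, base) for p in probs if p > 0)

# A fair coin is maximally uncertain (1 bit); a biased coin carries
# less entropy; a certain outcome carries none.
print(entropy([0.5, 0.5]))  # maximal for two outcomes
print(entropy([0.9, 0.1]))  # lower: the outcome is more predictable
print(entropy([1.0]))       # zero: no uncertainty at all
```

The same functional form (up to Boltzmann's constant) is the entropy of statistical mechanics, which is why the two fields share so much vocabulary.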

The Great Career-Boosting Reading List: Book #5: “Wild Mercy”

Taking your “sabbaths” – on a regular basis – might be the most important thing that you can do for your overall career success. This week’s YouTube video discusses sabbaths, and identifies how many very successful authors point out that “sabbath” – or resting from work – is essential to our productivity and to… Continue reading The Great Career-Boosting Reading List: Book #5: “Wild Mercy”

The Great Career-Boosting Reading List: Books 4(a) & 4(b)

Are you making a career transition? This is the situation that many of us are in these days – especially those who are graduating this week from Northwestern University’s School of Professional Studies (SPS) Master of Science in Data Science (MSDS) program. As we move into our career transitions, it is SO HELPFUL to… Continue reading The Great Career-Boosting Reading List: Books 4(a) & 4(b)

How Backpropagation and (Restricted) Boltzmann Machine Learning Combine in Deep Architectures

One of the most important things for us to understand, as we come into the “deep learning” aspect of AI (for the first time), is the relationship between backpropagation and the (restricted) Boltzmann machines, which we know comprise the essential core of various “deep learning” architectures. The essential idea in deep architectures is this: Each… Continue reading How Backpropagation and (Restricted) Boltzmann Machine Learning Combine in Deep Architectures
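To make the combination concrete: in classic deep architectures, each restricted Boltzmann machine is pretrained greedily (its hidden activations become the visible data for the next RBM), and backpropagation then fine-tunes the whole stack. The sketch below shows a tiny RBM trained with one-step contrastive divergence (CD-1) and greedy layer-wise stacking; it is a simplified illustration under my own assumptions (binary units, toy random data, fine-tuning omitted), not a tuned implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

class RBM:
    """Tiny restricted Boltzmann machine trained with one-step
    contrastive divergence (CD-1). A sketch, not a tuned implementation."""
    def __init__(self, n_visible, n_hidden, lr=0.1):
        self.W = rng.normal(0.0, 0.01, (n_visible, n_hidden))
        self.b_v = np.zeros(n_visible)  # visible biases
        self.b_h = np.zeros(n_hidden)   # hidden biases
        self.lr = lr

    def hidden_probs(self, v):
        return sigmoid(v @ self.W + self.b_h)

    def cd1_step(self, v0):
        # Positive phase: sample hidden units given the data.
        h0 = self.hidden_probs(v0)
        h0_sample = (rng.random(h0.shape) < h0).astype(float)
        # Negative phase: one reconstruction step.
        v1 = sigmoid(h0_sample @ self.W.T + self.b_v)
        h1 = self.hidden_probs(v1)
        # CD-1 approximation to the log-likelihood gradient.
        self.W += self.lr * (np.outer(v0, h0) - np.outer(v1, h1))
        self.b_v += self.lr * (v0 - v1)
        self.b_h += self.lr * (h0 - h1)

# Greedy layer-wise pretraining: the first RBM's hidden activations
# become the visible data for the second RBM. In a full deep
# architecture, backprop would then fine-tune the whole stack.
data = rng.integers(0, 2, size=(20, 6)).astype(float)
rbm1, rbm2 = RBM(6, 4), RBM(4, 2)
for _ in range(50):
    for v in data:
        rbm1.cd1_step(v)
hidden_data = rbm1.hidden_probs(data)
for _ in range(50):
    for h in hidden_data:
        rbm2.cd1_step(h)
print(rbm2.hidden_probs(rbm1.hidden_probs(data)).shape)  # (20, 2)
```

The key design point is that CD-1 gives each layer a good unsupervised starting point, so backprop only needs to make relatively small supervised adjustments.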

Latent Variables Enabled Effective Energy-Based Neural Networks: Seven Key Papers (Part 2 of 3)

Latent variables enabled effective energy-based neural networks. The key problem with the Little/Hopfield neural network was its limited memory capacity. This problem was resolved when Hinton, Ackley, and Sejnowski introduced the notion of latent variables, creating the Boltzmann machine. Seven key papers define the evolution of energy-based neural networks. Previously, we examined the first two… Continue reading Latent Variables Enabled Effective Energy-Based Neural Networks: Seven Key Papers (Part 2 of 3)
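The Little/Hopfield network that the Boltzmann machine improved on can be sketched in a few lines: Hebbian storage, an energy function that the update dynamics decrease, and asynchronous recall from a corrupted cue. This is a minimal illustration under my own assumptions (bipolar units, two stored patterns, ties leave a unit unchanged); the limited capacity it suffers from is what latent variables later fixed.

```python
import numpy as np

def train(patterns):
    """Hebbian weight matrix W = P^T P / n, with no self-connections."""
    n = patterns.shape[1]
    W = patterns.T @ patterns / n
    np.fill_diagonal(W, 0)
    return W

def energy(W, s):
    # E(s) = -1/2 s^T W s : the quantity the update dynamics decrease.
    return -0.5 * s @ W @ s

def recall(W, s, steps=5):
    """Asynchronous updates; a tie (zero field) leaves the unit as-is."""
    s = s.copy()
    for _ in range(steps):
        for i in range(len(s)):
            h = W[i] @ s
            if h > 0:
                s[i] = 1
            elif h < 0:
                s[i] = -1
    return s

patterns = np.array([[1, -1, 1, -1, 1, -1],
                     [1, 1, -1, -1, 1, 1]])
W = train(patterns)
noisy = np.array([1, -1, 1, -1, 1, 1])  # first pattern, one bit flipped
print(recall(W, noisy))  # recovers the first stored pattern
```

With only pairwise visible-unit connections, such a network can reliably store roughly 0.14·n random patterns before recall degrades; adding hidden (latent) units, as in the Boltzmann machine, lifts that limitation.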

Seven Key Papers for Energy-Based Neural Networks and Deep Learning (Part 1 of 3)

Seven key papers provide us with the evolutionary timeline for energy-based neural networks, up through and including deep learning. The timeline for these papers begins with William Little’s 1974 work on the first energy-based neural network, and then John Hopfield’s 1982 expansion on Little’s concepts, up through deep learning architectures as described by Hinton and… Continue reading Seven Key Papers for Energy-Based Neural Networks and Deep Learning (Part 1 of 3)