Mai Tai Thursdays, Coming Soon!

But first, a decent coffee … Hello, darling – I hope you’re well. The Backyard Roosters of Hawai’i As I write to you, it’s 6:10 AM on the Big Island of Hawai’i, it’s still dark out, and the roosters are in full crow. MULTIPLE roosters. All asserting their very strong, dominant “rooster-ness.” It’s been going on for… Continue reading Mai Tai Thursdays, Coming Soon!

How Backpropagation and (Restricted) Boltzmann Machine Learning Combine in Deep Architectures

One of the most important things for us to understand, as we come into the “deep learning” aspect of AI (for the first time), is the relationship between backpropagation and the (restricted) Boltzmann machines, which we know comprise the essential core of various “deep learning” architectures. The essential idea in deep architectures is this: Each… Continue reading How Backpropagation and (Restricted) Boltzmann Machine Learning Combine in Deep Architectures
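The combination this post describes — greedy layer-wise (restricted) Boltzmann machine learning followed by backpropagation fine-tuning — can be sketched roughly as follows. This is an illustrative NumPy sketch, not code from the post; the toy data, layer sizes, and the CD-1 training loop are assumptions chosen for brevity.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def train_rbm(data, n_hidden, epochs=50, lr=0.1):
    """Train one RBM with single-step contrastive divergence (CD-1)."""
    n_visible = data.shape[1]
    W = rng.normal(0.0, 0.01, (n_visible, n_hidden))
    for _ in range(epochs):
        # positive phase: hidden activations driven by the data
        h_prob = sigmoid(data @ W)
        h_sample = (rng.random(h_prob.shape) < h_prob).astype(float)
        # negative phase: one step of reconstruction
        v_recon = sigmoid(h_sample @ W.T)
        h_recon = sigmoid(v_recon @ W)
        # CD-1 update: data-driven correlations minus model correlations
        W += lr * (data.T @ h_prob - v_recon.T @ h_recon) / len(data)
    return W

# toy binary data: two repeated patterns
data = np.array([[1, 1, 0, 0], [0, 0, 1, 1]] * 10, dtype=float)

# greedy layer-wise pretraining: each RBM's hidden activations
# become the "data" for the next RBM in the stack
W1 = train_rbm(data, n_hidden=3)
layer1 = sigmoid(data @ W1)
W2 = train_rbm(layer1, n_hidden=2)

# the pretrained weights would then initialize a feed-forward network,
# which backpropagation fine-tunes against a supervised target
print(W1.shape, W2.shape)
```

The key design idea is the hand-off: unsupervised RBM learning finds a useful weight initialization layer by layer, and backpropagation then adjusts the whole stack end-to-end.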

What It’s Like – Getting Very Close to the Edge (New Scientific Breakthrough) Part 1 (of Several)

Well, the best thing that I’ve got going for me right now is that the offerings on Netflix are so d*** boring that doing research is MUCH more interesting. I’ve read and re-read all my light, fun reading around the house … about three to four times each, over these past two years. My fave… Continue reading What It’s Like – Getting Very Close to the Edge (New Scientific Breakthrough) Part 1 (of Several)

Merry Christmas from Hawai’i!

Poinsettias. They’re the ultimate Christmassy-image of Hawai’i, if you live on the mainland. And back when I lived on the mainland, poinsettias were something that I got at the grocery store or hardware store, starting around Thanksgiving. It wasn’t until I’d moved here … it was that first Christmas season, and I was driving down… Continue reading Merry Christmas from Hawai’i!

Latent Variables Enabled Effective Energy-Based Neural Networks: Seven Key Papers (Part 2 of 3)

Latent variables enabled effective energy-based neural networks. The key problem with the Little/Hopfield neural network was its limited memory capacity. This problem was resolved when Hinton, Ackley, and Sejnowski introduced the notion of latent variables, creating the Boltzmann machine. Seven key papers define the evolution of energy-based neural networks. Previously, we examined the first two… Continue reading Latent Variables Enabled Effective Energy-Based Neural Networks: Seven Key Papers (Part 2 of 3)
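The contrast the excerpt draws — a Little/Hopfield network with only visible units versus a Boltzmann machine with latent (hidden) units — comes down to the energy function each model minimizes. A minimal sketch, with illustrative weights and patterns (not taken from the papers):

```python
import numpy as np

# Hopfield network: every unit is visible, so the energy depends only
# on pairwise correlations among the stored patterns -- the root of
# the network's limited memory capacity.
def hopfield_energy(s, W):
    return -0.5 * s @ W @ s

# Restricted Boltzmann machine: latent (hidden) units h interact with
# the visible pattern v, letting the model capture structure that
# visible-visible weights alone cannot.
def rbm_energy(v, h, W, b, c):
    return -(v @ W @ h) - (b @ v) - (c @ h)

s = np.array([1.0, -1.0, 1.0])
W_hop = np.array([[0, 1, -1], [1, 0, 1], [-1, 1, 0]], dtype=float)
print(hopfield_energy(s, W_hop))   # energy of one visible state

v = np.array([1.0, 0.0, 1.0])      # visible pattern
h = np.array([1.0, 0.0])           # latent-variable configuration
W = np.full((3, 2), 0.5)
b = np.zeros(3)                    # visible biases
c = np.zeros(2)                    # hidden biases
print(rbm_energy(v, h, W, b, c))   # energy of a joint (v, h) state
```

In both models, learning shapes the weights so that desired patterns sit at low-energy states; the latent variables simply enlarge the family of distributions those low-energy states can represent.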

Seven Key Papers for Energy-Based Neural Networks and Deep Learning (Part 1 of 3)

Seven key papers provide us with the evolutionary timeline for energy-based neural networks, up through and including deep learning. The timeline for these papers begins with William Little’s 1974 work on the first energy-based neural network, and then John Hopfield’s 1982 expansion on Little’s concepts, up through deep learning architectures as described by Hinton and… Continue reading Seven Key Papers for Energy-Based Neural Networks and Deep Learning (Part 1 of 3)