Tag: Geoffrey Hinton
Evolution of NLP Algorithms through Latent Variables: Future of AI (Part 3 of 3)
AJM Note: Although nearly complete, references and discussion are still being added to this blogpost. This note will be removed once the blogpost is completed – anticipated over the Memorial Day weekend, 2023. Note updated 12:30 AM, Hawai’i Time, Monday, May 29, 2023. This blogpost accompanies a YouTube vid on the same topic of the Evolution… Continue reading Evolution of NLP Algorithms through Latent Variables: Future of AI (Part 3 of 3)
New Neural Network Class: Framework: The Future of AI (Part 2 of 3)
We want to identify how and where the next “big breakthrough” in AI will occur, and we use three tools or approaches to do so. The Quick Overview: get the quick overview with this YouTube #short. The Full YouTube: Maren, Alianna J. 2023. “A New Neural Network Class: Creating the… Continue reading New Neural Network Class: Framework: The Future of AI (Part 2 of 3)
Variational Free Energy and Active Inference: Pt 1
Overarching Story Line: This new blogpost series, on variational free energy and active inference, presents tutorial-level studies centered on the free energy equation (Eq. 2.7) of Karl Friston’s 2013 paper, “Life as We Know It” – specifically, the free energy equation shown in Figure 1 below. Over this blogpost series, we will reinforce… Continue reading Variational Free Energy and Active Inference: Pt 1
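For orientation, the variational free energy at the heart of this series is standardly written as follows. (This is the generic textbook decomposition, with q a variational density over hidden states ψ and s the sensory data; it is not a transcription of Friston’s Eq. 2.7, whose exact notation readers should take from the paper itself.)

```latex
F \;=\; \mathbb{E}_{q(\psi)}\!\left[\ln q(\psi) - \ln p(s,\psi)\right]
  \;=\; D_{\mathrm{KL}}\!\left[\,q(\psi)\,\middle\|\,p(\psi \mid s)\,\right] \;-\; \ln p(s)
```

Because the KL divergence is non-negative, F upper-bounds the surprise −ln p(s); minimizing F drives q toward the true posterior p(ψ | s).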
When a Classifier Acts as an Autoencoder, and an Autoencoder Acts as a Classifier (Part 1 of 3)
One of the biggest mental sinkholes into which AI students can fall is not quite understanding the fundamental difference between how our two basic “building block” networks operate: the Multilayer Perceptron (MLP), trained with backpropagation (or any form of gradient descent learning), and the (restricted) Boltzmann machine (RBM), trained with contrastive divergence. It’s easy… Continue reading When a Classifier Acts as an Autoencoder, and an Autoencoder Acts as a Classifier (Part 1 of 3)
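The contrast this post describes can be sketched in a few lines of NumPy (the layer sizes and learning rate here are illustrative, not taken from the post): an MLP update descends the gradient of a supervised loss, while an RBM’s CD-1 update needs no labels and instead compares data-driven and reconstruction-driven correlations.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

# --- MLP-style step: supervised, descend the gradient of a loss ---
# One sigmoid layer with squared-error loss, for illustration.
def mlp_step(W, x, target, lr=0.1):
    y = sigmoid(W @ x)                      # forward pass
    err = y - target                        # dLoss/dy (up to a constant)
    grad = np.outer(err * y * (1 - y), x)   # chain rule: dLoss/dW
    return W - lr * grad                    # move against the gradient

# --- RBM-style step: unsupervised, contrastive divergence (CD-1) ---
# W couples visible units v and hidden units h; no labels involved.
def cd1_step(W, v0, lr=0.1):
    h0 = sigmoid(W @ v0)                          # positive phase: P(h|v0)
    h_sample = (rng.random(h0.shape) < h0) * 1.0  # sample hidden states
    v1 = sigmoid(W.T @ h_sample)                  # reconstruct: P(v|h)
    h1 = sigmoid(W @ v1)                          # negative phase: P(h|v1)
    # data correlations minus reconstruction correlations
    return W + lr * (np.outer(h0, v0) - np.outer(h1, v1))

x = np.array([1.0, 0.0, 1.0])                 # one visible pattern
W = rng.normal(scale=0.1, size=(2, 3))        # 2 hidden, 3 visible units
W_mlp = mlp_step(W, x, target=np.array([1.0, 0.0]))
W_rbm = cd1_step(W, x)
```

Note that `cd1_step` never sees a target: the “teaching signal” is the model’s own reconstruction of the data, which is exactly why the same machinery can act as an autoencoder.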
Strategizing Your Research Project: Developing Your Portfolio Elements
When life gets hard and you still have to get your research project done, it can be difficult to focus and figure out the “most important thing.” Blog Update (How to Get More Focus and More Time): Dear All – this is an update, added Sept. 22, 2022. Building your Portfolio is a BIG THING. Just… Continue reading Strategizing Your Research Project: Developing Your Portfolio Elements
Entropy in Energy-Based Neural Networks – Seven Key Papers (Part 3 of 3)
If you’ve looked at some classic papers in energy-based neural networks (e.g., the Hopfield neural network, the Boltzmann machine, the restricted Boltzmann machine, and all forms of deep learning), you’ll see that they don’t use the word “entropy.” At the same time, we’ve stated that entropy is a fundamental concept in these energy-based neural networks.… Continue reading Entropy in Energy-Based Neural Networks – Seven Key Papers (Part 3 of 3)
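As a concrete anchor for the claim that entropy is fundamental even when the word never appears: in an energy-based model, the equilibrium (Boltzmann) distribution over states is determined by the energies alone, and that distribution carries a Shannon entropy whether or not the paper names it. A minimal sketch, with illustrative energy values (not from any of the seven papers):

```python
import math

# Boltzmann distribution over states with energies E_i: p_i ∝ exp(-E_i / T)
def boltzmann_probs(energies, T=1.0):
    weights = [math.exp(-E / T) for E in energies]
    Z = sum(weights)                        # partition function
    return [w / Z for w in weights]

def entropy(probs):
    # Shannon entropy in nats: S = -sum_i p_i ln p_i
    return -sum(p * math.log(p) for p in probs if p > 0)

energies = [0.0, 1.0, 2.0, 3.0]             # illustrative energy levels
p = boltzmann_probs(energies, T=1.0)
S = entropy(p)
```

Raising T flattens the distribution toward uniform (entropy approaches ln N); lowering T concentrates probability on the lowest-energy state (entropy approaches 0) – the same trade-off that energy-based training implicitly negotiates.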