AI, Climate, and Energy (Resource Collection)

One of the most important questions in the emerging AI world is how we handle the ENERGY NEEDS of AI systems. This invokes a much bigger set of questions: how will we handle energy needs overall? How do we mitigate (and potentially reverse) the climate crisis? How do we build sustainability and resilience into our energy… Continue reading AI, Climate, and Energy (Resource Collection)

It Might All Come Down to Rare Earths

Jensen Huang’s keynote talk at NVIDIA GTC last week was very likely the tip of the iceberg. Demand for processing units is going up. Going CRAZY up. NVIDIA’s new product releases and recent stock price upsurges reflect that. But NVIDIA is not the only chip-maker in the US. The Biden Administration has been investing… Continue reading It Might All Come Down to Rare Earths

AGI: Generative AI, AGI, the Future of AI, and You

Generative AI is about fifty years old. There are four main kinds of generative AI (energy-based neural networks, variational inference, variational autoencoders, and transformers). There are three fundamental methods underlying all forms of generative AI: the reverse Kullback-Leibler divergence, Bayesian conditional probabilities, and statistical mechanics. Transformer-based methods add multi-head attention and positional encoding. Generative AI is not, and never can be, artificial general intelligence (AGI). AGI requires bringing in more architectural components, such as ontologies (e.g., knowledge graphs), together with a linking mechanism. Themesis has developed this linking mechanism: CORTECONs(R), or COntent-Retentive, TEmporally-CONnected neural networks. CORTECONs(R) will enable near-term AGI development. Preliminary CORTECON work, based on the cluster variation method in statistical mechanics, includes theory, architecture, code, and worked examples, all available for public access. Community participation is encouraged.
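
As a quick notational aside, here is a minimal sketch of the three underlying methods in their standard textbook forms (the post’s own derivations may use different notation or conventions):

Reverse Kullback-Leibler divergence, in which the approximating (model) distribution Q is fit to the target distribution P by minimizing
\[ D_{\mathrm{KL}}(Q \,\|\, P) \;=\; \sum_{x} Q(x)\,\ln \frac{Q(x)}{P(x)}. \]

Bayesian conditional probability, linking latent (hidden) units h to visible data v:
\[ P(h \mid v) \;=\; \frac{P(v \mid h)\,P(h)}{P(v)}. \]

Statistical mechanics, via the Boltzmann distribution over an energy function E(x) with partition function Z:
\[ P(x) \;=\; \frac{e^{-E(x)}}{Z}, \qquad Z = \sum_{x} e^{-E(x)}. \]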

CORTECONs: AGI in 2024-2025 – R&D Plan Overview

By the end of 2024, we anticipate having a fully functional CORTECON (COntent-Retentive, TEmporally-CONnected) framework in place. This will be the core AGI (artificial general intelligence) engine. This is all very straightforward: a calm, steady development that we expect will unfold rather smoothly. The essential AGI engine is a CORTECON. The main internal… Continue reading CORTECONs: AGI in 2024-2025 – R&D Plan Overview

AGI Notation: Friston’s Use of “Psi”

We want to create an AGI (artificial general intelligence). If you’re reading this post, we trust that is your intention as well. We already know that AGI won’t come out of transformers. They are, in essence, content-addressable memories. That’s what they can do; that’s ALL that they can do. Our core equation comes… Continue reading AGI Notation: Friston’s Use of “Psi”
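
For readers new to the notation: in Friston’s free-energy papers, psi (ψ) conventionally denotes the hidden (external) states of the world, s the sensory states, and q(ψ) the recognition density. As a sketch of the standard textbook form (not a quote from the post), the variational free energy can be written
\[ F \;=\; \mathbb{E}_{q(\psi)}\!\left[-\ln p(s, \psi)\right] \;-\; H\!\left[q(\psi)\right] \;=\; D_{\mathrm{KL}}\!\left[\,q(\psi)\,\|\,p(\psi \mid s)\,\right] \;-\; \ln p(s), \]
so minimizing F both sharpens the recognition density q(ψ) and bounds the surprise, -ln p(s).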

Evolution of NLP Algorithms through Latent Variables: Future of AI (Part 3 of 3)

AJM Note: ALTHOUGH NEARLY COMPLETE, references and discussion are still being added to this blogpost. This note will be removed once the blogpost is completed – anticipated over the Memorial Day weekend, 2023. Note updated 12:30 AM, Hawai’i Time, Tuesday, May 29, 2023. This blogpost accompanies a YouTube vid on the same topic of the Evolution… Continue reading Evolution of NLP Algorithms through Latent Variables: Future of AI (Part 3 of 3)

New Neural Network Class: Framework: The Future of AI (Part 2 of 3)

We want to identify how and where the next “big breakthrough” will occur in AI. We use three tools or approaches to identify where this next big breakthrough will occur. The Quick Overview: get the quick overview with this YouTube #short. The Full YouTube: Maren, Alianna J. 2023. “A New Neural Network Class: Creating the… Continue reading New Neural Network Class: Framework: The Future of AI (Part 2 of 3)

Kuhnian Normal and Breakthrough Moments: The Future of AI (Part 1 of 3)

Over the past fifty years, there have only been a few Kuhnian “paradigm shift” moments in neural networks. We’re ready for something new!

The Future of AI: Part 0 (Prelude) – Reductio ad Absurdum

A few weeks ago, I went to the Lihue-based Kaua’i Farmer’s Market for locally-grown fresh veggies. And, of course, I swung by the Kaua’i Master Gardeners booth. Now, I’d visited with these guys before. And yes, they knew that I taught AI at Northwestern. But still, my jaw dropped when the first question that one… Continue reading The Future of AI: Part 0 (Prelude) – Reductio ad Absurdum