CORTECONs: AGI in 2024-2025 – R&D Plan Overview

By the end of 2024, we anticipate having a fully functional CORTECON (COntent-Retentive, TEmporally-CONnected) framework in place. This will be the core AGI (artificial general intelligence) engine. The development path is straightforward, and we expect it to unfold smoothly and steadily. The essential AGI engine is a CORTECON. The main internal… Continue reading CORTECONs: AGI in 2024-2025 – R&D Plan Overview

CORTECONs and AGI: Reaching Latent Layer Equilibrium

The most important thing in building an AGI is the ability to repeatedly bring the latent layer to equilibrium. This is the fundamental capability that has been missing in previous neural networks. The lack of a dynamic process to continuously reach a free energy minimum is why we have not had, until now, a robust… Continue reading CORTECONs and AGI: Reaching Latent Layer Equilibrium
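As a toy illustration of what "bringing a latent layer to a free energy minimum" can mean (a minimal sketch, not the CORTECON implementation): suppose the layer's state is summarized by a single fraction x of "on" nodes, with a hypothetical free energy F(x) = U(x) - T·S(x). Simple gradient descent then drives x to the equilibrium value. The enthalpy term and all parameter values below are made-up illustrations.

```python
import math

def free_energy(x, h=0.5, T=1.0):
    """Hypothetical free energy F = U - T*S for a latent layer whose
    state is summarized by x, the fraction of 'on' nodes."""
    U = -h * x                                           # toy enthalpy term
    S = -(x * math.log(x) + (1 - x) * math.log(1 - x))   # binary entropy
    return U - T * S

def equilibrate(x=0.9, lr=0.05, steps=200, eps=1e-4):
    """Numerical gradient descent on x until F'(x) ~ 0."""
    for _ in range(steps):
        grad = (free_energy(x + eps) - free_energy(x - eps)) / (2 * eps)
        x -= lr * grad
        x = min(max(x, 1e-3), 1 - 1e-3)   # keep x strictly inside (0, 1)
    return x

x_eq = equilibrate()
```

For this F, the minimum sits where the enthalpy gradient balances the entropy gradient, i.e. x = sigmoid(h/T); repeatedly re-running `equilibrate` from different starting states is the kind of "return to equilibrium" dynamic the post describes.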

Learning Energy-Based Neural Networks

In order to read any of the classic (and important) papers on energy-based neural networks, we need to know the vocabulary and essential concepts from several disciplines. In today’s associated YouTube video, we illustrate how these different terms – and their respective disciplines – are blended together, using the Salakhutdinov and Hinton (2012) paper as a reference… Continue reading Learning Energy-Based Neural Networks
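As one concrete anchor for that vocabulary, here is a minimal sketch of the energy function of a Restricted Boltzmann Machine, one of the classic energy-based models treated in this literature: E(v, h) = -b·v - c·h - v·W·h. The weights and biases below are random illustrative values, not taken from any paper.

```python
import numpy as np

rng = np.random.default_rng(0)
W = rng.normal(scale=0.1, size=(4, 3))   # visible-to-hidden weights
b = np.zeros(4)                          # visible-unit biases
c = np.zeros(3)                          # hidden-unit biases

def energy(v, h):
    """RBM energy: E(v, h) = -b.v - c.h - v.W.h.
    Low energy corresponds to high probability under the Boltzmann
    distribution p(v, h) proportional to exp(-E(v, h))."""
    return -(b @ v) - (c @ h) - (v @ W @ h)

v = np.array([1, 0, 1, 0])   # example visible configuration
h = np.array([0, 1, 1])      # example hidden configuration
print(energy(v, h))
```

Every term in the vocabulary list (partition function, free energy, equilibrium distribution) is defined relative to an energy function of this general shape.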

AGI Notation: Friston’s Use of “Psi”

We want to create an AGI (artificial general intelligence). If you’re reading this post, we trust that is your intention as well. We already know that AGI won’t come out of transformers. They are, in their essence, content-addressable memories. That’s what they can do; that’s ALL that they can do. Our core equation comes… Continue reading AGI Notation: Friston’s Use of “Psi”

Socrates, Pericles, and Aspasia: How A Muse Helped the Genesis of Western Thought

Socrates is a name that most of us know, even if only tangentially. We’ve probably all heard of the “Socratic method” of teaching. Many of us also know of Pericles, the great Athenian general who led Athens’ armies in its various wars – including those against Sparta. But Aspasia? Not as many of us have… Continue reading Socrates, Pericles, and Aspasia: How A Muse Helped the Genesis of Western Thought

The 1D CVM (Cluster Variation Method): Complete Interactive Code (Part 2)

The most important element in creating an AGI (artificial general intelligence) is that the latent node layer needs to allow a range of neural dynamics. The most important of these dynamics will be the ability for the system to rapidly undergo a state change, from mostly “off” nodes to mostly “on.” Neurophysiologists have observed this… Continue reading The 1D CVM (Cluster Variation Method): Complete Interactive Code (Part 2)
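As a toy picture of that rapid off-to-on state change (a hypothetical mean-field sketch, not the CVM code itself): let the fraction x of "on" nodes evolve through a sigmoid update with coupling strength J and external drive h. With strong coupling the layer is bistable, and it snaps from mostly "off" to mostly "on" when the drive flips sign. The parameter values are illustrative.

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def settle(x, h, J=4.0, steps=50):
    """Iterate the mean-field update x <- sigmoid(J*(2x - 1) + h)
    for the fraction x of 'on' nodes until it settles."""
    for _ in range(steps):
        x = sigmoid(J * (2 * x - 1) + h)
    return x

x_off = settle(0.05, h=-3.0)   # negative drive: layer settles mostly 'off'
x_on = settle(x_off, h=+3.0)   # flip the drive: layer snaps mostly 'on'
```

The jump between the two branches is abrupt rather than gradual, which is the qualitative behavior (observed by neurophysiologists) that the post says the latent layer needs to support.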

Next-Era AGI: Neurophysiology Basis

The next era of artificial intelligence (AI), or artificial general intelligence (AGI), will rest on neurophysiology models that emphasize neuronal group dynamics, rather than the behaviors of single neurons. This post addresses three questions about the neurophysiological underpinnings of the NEXT generation of neural network modeling: Neurophysiology: Important Early Works… Continue reading Next-Era AGI: Neurophysiology Basis

1-D Cluster Variation Method: Simple Text String Worked Example (Part 1)

Today, we focus on getting the entropy term in the 1-D cluster variation method (the 1D CVM), using a simple text string as the basis for our worked example. This blog post is in progress. Please check back tomorrow for the updated version. Thank you! – AJM Our End Goal Our end goal – the reason that… Continue reading 1-D Cluster Variation Method: Simple Text String Worked Example (Part 1)
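The counting step of such a worked example can be sketched directly (a hedged sketch: the x_i unit fractions and y_ij nearest-neighbor pair fractions follow the usual CVM naming convention, the example string is made up, and the post's exact entropy bookkeeping may differ). We tally unit and ordered-pair fractions in a binary string, then combine them into the pairwise entropy per site, S = -Σ y ln y + Σ x ln x, which for a 1-D chain is the exact Markov entropy rate.

```python
from collections import Counter
import math

def cvm_fractions(s):
    """Unit fractions x_i and ordered nearest-neighbor pair
    fractions y_ij for a binary string such as 'AABBA...'."""
    x = {k: v / len(s) for k, v in Counter(s).items()}
    pairs = list(zip(s, s[1:]))
    y = {k: v / len(pairs) for k, v in Counter(pairs).items()}
    return x, y

def entropy_1d(x, y):
    """Pairwise (1-D chain) entropy per site:
    S = -sum_y y*ln(y) + sum_x x*ln(x)."""
    return (-sum(p * math.log(p) for p in y.values())
            + sum(p * math.log(p) for p in x.values()))

x, y = cvm_fractions("AABABBABAABBAABB")
S = entropy_1d(x, y)
```

For a binary alphabet, S always lands between 0 (a fully predictable string) and ln 2 (a maximally random one), which makes it a quick sanity check on the hand-counted fractions.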