AGI is not here … just yet. [*BLOGPOST IN PROGRESS – CHECK BACK FOR DAILY UPDATES!* (AJM, Tuesday, May 14, 2024; 0200 Hawai’i Time)] But – it’s getting close. Very, VERY close. What’s interesting – and important to all of us – is that there is no “singular” AGI, and it’s important to make this… Continue reading Emerging AGIs: Early 2024 Playing Field
AI, Climate, and Energy (Resource Collection)
One of the most important things in the emerging AI world is how we handle the ENERGY NEEDS of AI systems. This raises a much bigger question – how will we handle energy needs overall? How do we mitigate (and potentially reverse) the climate crisis? How do we build sustainability and resilience into our energy… Continue reading AI, Climate, and Energy (Resource Collection)
Writing to Get Your Next Job: Five Essential Rules
Whether you’re currently employed, or are actively seeking (as in, job-hunting is your full-time occupation), one of the most important things that you can do is to build your Portfolio. We talked about your Portfolio in this previous blogpost, with examples of how to use GitHub, LinkedIn, and your personal domain as “Portfolio bases.” Your… Continue reading Writing to Get Your Next Job: Five Essential Rules
Brigitte Bardot Says It All (An Exercise in the 1D Cluster Variation Method)
Probably best to get the kids out of the room before you play this one. Lots of heavy breathing by Brigitte. And you can read the backstory here. (And a bit more here, if you’re so inclined.) But to business … The Starting Point … and the FIRST Illustrative Text String I had previously worked… Continue reading Brigitte Bardot Says It All (An Exercise in the 1D Cluster Variation Method)
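(A quick, generic warm-up for readers new to the method; this sketch is mine, not the post’s code. The 1D cluster variation method works with configuration variables tallied from a sequence: unit fractions, nearest-neighbor pairs, and triplets. Assuming a text string already reduced to two states, A and B, the tallying might look like this:)

```python
from collections import Counter

def cvm_1d_counts(s):
    """Tally 1-D CVM configuration variables from a binary string:
    unit fractions (x), nearest-neighbor pair fractions (y), and
    triplet fractions (z). Illustrative only; the full 1-D CVM also
    tallies next-nearest-neighbor pairs (w), omitted here for brevity."""
    n = len(s)
    x = Counter(s)                                   # single units
    y = Counter(s[i:i + 2] for i in range(n - 1))    # adjacent pairs
    z = Counter(s[i:i + 3] for i in range(n - 2))    # adjacent triplets
    return ({k: v / n for k, v in x.items()},
            {k: v / (n - 1) for k, v in y.items()},
            {k: v / (n - 2) for k, v in z.items()})

# A toy "text string" already mapped to two states, A and B.
x, y, z = cvm_1d_counts("AABABBABAABBAABA")
print(x); print(y); print(z)
```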
Pivoting to AGI: What to Read/Watch/Do This Weekend
We are moving from a generative-AI era to an AGI era. What that means – in the simplest technical terms – is that we’re pivoting from “single-algorithm systems” to systems that must – intrinsically – involve multiple major subsystems, and multiple control structures. We’re closing out a 50-year connectionist AI era. This era began with… Continue reading Pivoting to AGI: What to Read/Watch/Do This Weekend
AGI: Google’s Mixture of Depths is a Baby Step Towards Artificial General Intelligence (AGI)
AGI: Generative AI, AGI, the Future of AI, and You
Generative AI is about fifty years old. There are four main kinds of generative AI (energy-based neural networks, variational inference, variational autoencoders, and transformers). There are three fundamental methods underlying all forms of generative AI: the reverse Kullback-Leibler divergence, Bayesian conditional probabilities, and statistical mechanics. Transformer-based methods add multi-head attention and positional encoding. Generative AI is not, and never can be, artificial general intelligence, or AGI. AGI requires bringing in more architectural components, such as ontologies (e.g., knowledge graphs), and a linking mechanism. Themesis has developed this linking mechanism, CORTECONs(R), for COntent-Retentive, TEmporally-CONnected neural networks. CORTECONs(R) will enable near-term AGI development. Preliminary CORTECON work, based on the cluster variation method in statistical mechanics, includes theory, architecture, code, and worked examples, all available for public access. Community participation is encouraged.
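(A minimal sketch, not from the post itself, of the first of those three fundamental methods: the reverse Kullback-Leibler divergence for discrete distributions. The function name and the toy distributions are mine, purely for illustration.)

```python
import numpy as np

def reverse_kl(q, p, eps=1e-12):
    """Reverse Kullback-Leibler divergence D_KL(q || p) for discrete
    distributions, with a small epsilon guarding against log(0)."""
    q = np.asarray(q, dtype=float)
    p = np.asarray(p, dtype=float)
    return float(np.sum(q * (np.log(q + eps) - np.log(p + eps))))

# q is the approximating (model) distribution; p is the target.
q = [0.7, 0.2, 0.1]
p = [0.5, 0.3, 0.2]
print(reverse_kl(q, p))  # ~0.085 nats
```

Reverse KL (q relative to p) is the direction minimized in variational inference; it penalizes q for putting probability mass where p has little, which is why it tends toward mode-seeking approximations.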
CORTECONs: AGI in 2024-2025 – R&D Plan Overview
By the end of 2024, we anticipate having a fully-functional CORTECON (COntent-Retentive, TEmporally-CONnected) framework in place. This will be the core AGI (artificial general intelligence) engine. This is all very straightforward. It’s a calm, steady development – we expect it will all unfold rather smoothly. The essential AGI engine is a CORTECON. The main internal… Continue reading CORTECONs: AGI in 2024-2025 – R&D Plan Overview
CORTECONs and AGI: Reaching Latent Layer Equilibrium
The most important thing in building an AGI is the ability to repeatedly bring the latent layer to equilibrium. This is the fundamental capability that has been missing in previous neural networks. The lack of a dynamic process to continuously reach a free energy minimum is why we have not had, until now, a robust… Continue reading CORTECONs and AGI: Reaching Latent Layer Equilibrium
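(To make “bringing a latent layer to equilibrium” concrete, here is a minimal sketch under generic assumptions: a mean-field latent layer with symmetric lateral weights, relaxed by damped fixed-point iteration toward its free-energy minimum. This is illustrative only; it is not the CORTECON update rule, and every name in it is mine.)

```python
import numpy as np

def equilibrate_latent(v, W, L, b, steps=200, damping=0.5, tol=1e-6):
    """Damped fixed-point iteration driving latent activations h toward
    the mean-field free-energy minimum, given visible input v, input
    weights W, symmetric lateral weights L, and bias b.
    Illustrative sketch; NOT the CORTECON update rule."""
    h = np.full(b.shape, 0.5)        # start at the maximum-entropy state
    drive = W @ v + b                # constant external drive
    for _ in range(steps):
        # Mean-field update: h* = sigmoid(drive + L h)
        h_new = 1.0 / (1.0 + np.exp(-(drive + L @ h)))
        if np.max(np.abs(h_new - h)) < tol:   # at equilibrium?
            return h_new
        h = damping * h + (1.0 - damping) * h_new  # damped step
    return h

rng = np.random.default_rng(0)
v = rng.normal(size=8)
W = rng.normal(scale=0.3, size=(4, 8))
L = rng.normal(scale=0.1, size=(4, 4))
L = (L + L.T) / 2.0                  # lateral weights must be symmetric
np.fill_diagonal(L, 0.0)             # no self-connections
b = np.zeros(4)
print(equilibrate_latent(v, W, L, b))
```

The damping term is the standard guard against oscillation in fixed-point relaxation; at convergence, the layer sits at a (local) free-energy minimum for the given input.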
AGI Notation: Friston’s Use of “Psi”
We want to create an AGI (artificial general intelligence). If you’re reading this post, we trust that is your intention as well. We already know that AGI won’t come out of transformers. They are, in their essence, content-addressable memories. That’s what they can do; that’s ALL that they can do. Our core equation comes… Continue reading AGI Notation: Friston’s Use of “Psi”
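(For readers who want the notation in front of them before reading the full post: in Friston’s free-energy papers, the symbol ψ typically stands for the external, or hidden, states of the world that the agent must infer. A standard, hedged reconstruction of the variational free energy in that notation, not copied from this post, is:)

$$
F \;=\; \mathbb{E}_{q(\psi)}\big[\ln q(\psi) - \ln p(s,\psi)\big]
\;=\; D_{\mathrm{KL}}\big(q(\psi)\,\|\,p(\psi \mid s)\big) \;-\; \ln p(s),
$$

where s denotes sensory states and q(ψ) is the recognition density. Since the last term does not depend on q, minimizing F over q drives q(ψ) toward the true posterior p(ψ | s).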