Writing to Get Your Next Job: Five Essential Rules

Whether you’re currently employed or actively seeking (as in, job-hunting is your full-time occupation), one of the most important things you can do is build your Portfolio. We talked about your Portfolio in this previous blog post, with examples of how to use GitHub, LinkedIn, and your personal domain as “Portfolio bases.” Your… Continue reading Writing to Get Your Next Job: Five Essential Rules

Brigitte Bardot Says It All (An Exercise in the 1D Cluster Variation Method)

Probably best to get the kids out of the room before you play this one. Lots of heavy breathing by Brigitte. And you can read the backstory here. (And a bit more here, if you’re so inclined.) But to business … The Starting Point … and the FIRST Illustrative Text String I had previously worked… Continue reading Brigitte Bardot Says It All (An Exercise in the 1D Cluster Variation Method)

Pivoting to AGI: What to Read/Watch/Do This Weekend

We are moving from a generative-AI era to an AGI era. What that means – in the simplest technical terms – is that we’re pivoting from “single-algorithm systems” to systems that must – intrinsically – involve multiple major subsystems and multiple control structures. We’re closing out a 50-year connectionist AI era. This era began with… Continue reading Pivoting to AGI: What to Read/Watch/Do This Weekend

Building Your Online Portfolio (A Collection of Useful Links)

One of the strongest things that we can do to position ourselves – for the next career move, and also for creating a new tier of powerful professional relationships – is to build our online Portfolio. This post provides links to good Portfolio examples for three different cases, along with a collection of useful… Continue reading Building Your Online Portfolio (A Collection of Useful Links)

It Might All Come Down to Rare Earths

Jensen Huang’s keynote talk at NVIDIA GTC last week was very likely the tip of the iceberg. Demand for processing units is going up. Going CRAZY up. NVIDIA’s new product releases and recent stock price upsurges reflect that. But NVIDIA is not the only chip-maker in the US. The Biden Administration has been investing… Continue reading It Might All Come Down to Rare Earths

AGI: Generative AI, AGI, the Future of AI, and You

Generative AI is about fifty years old. There are four main kinds of generative AI (energy-based neural networks, variational inference, variational autoencoders, and transformers). There are three fundamental methods underlying all forms of generative AI: the reverse Kullback-Leibler divergence, Bayesian conditional probabilities, and statistical mechanics. Transformer-based methods add in multi-head attention and positional encoding. Generative AI is not, and never can be, artificial general intelligence, or AGI. AGI requires bringing in more architectural components, such as ontologies (e.g., knowledge graphs), and a linking mechanism. Themesis has developed this linking mechanism, CORTECONs(R), for COntent-Retentive, TEmporally-CONnected neural networks. CORTECONs(R) will enable near-term AGI development. Preliminary CORTECON work, based on the cluster variation method in statistical mechanics, includes theory, architecture, code, and worked examples, all available for public access. Community participation is encouraged.
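
As a quick orientation for readers new to the terminology: the “reverse” Kullback-Leibler divergence mentioned above is the quantity KL(Q || P) that variational methods minimize. Here is a minimal sketch, not taken from the post itself, showing how it is computed for two small discrete distributions; the arrays p and q are illustrative placeholders.

```python
# A minimal sketch (illustrative only): reverse KL divergence KL(Q || P)
# for two discrete distributions, the quantity variational methods minimize.
import numpy as np

def reverse_kl(q: np.ndarray, p: np.ndarray) -> float:
    """Return KL(Q || P) = sum_i q_i * log(q_i / p_i)."""
    q = q / q.sum()              # normalize to proper distributions
    p = p / p.sum()
    mask = q > 0                 # terms with q_i = 0 contribute nothing
    return float(np.sum(q[mask] * np.log(q[mask] / p[mask])))

# Placeholder example distributions (not from the post):
p = np.array([0.7, 0.2, 0.1])    # "true" distribution P
q = np.array([0.6, 0.3, 0.1])    # approximating distribution Q
print(f"KL(Q || P) = {reverse_kl(q, p):.4f}")   # ≈ 0.0291
```

Minimizing this reverse form, rather than the forward KL(P || Q), is what gives variational approximations their characteristic mode-seeking behavior.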

If You’re Teaching Yourself Generative AI: Some Resources (Book Chapters)

Teaching yourself generative AI (“gen-AI”) has to be one of the hardest things in the world. The really important, classic papers – the ones that you WISH THAT YOU COULD READ – all presuppose that you have a lot of knowledge coming in, about all SORTS of things. The situation is the same as it… Continue reading If You’re Teaching Yourself Generative AI: Some Resources (Book Chapters)