You’re an AI student or professional.
You’ve built things.
You’ve created the “monster stack” of deep learning layers, CNNs (convolutional neural networks) included.
You’ve built some absolutely AH-MAZING, jaw-dropping applications.
And now, you’re ready to dig in … and master the fundamentals.
So you pick up a classic paper – perhaps Salakhutdinov & Hinton (2012) on deep Boltzmann machines.
And a few minutes into this paper … you’re blown out of the water.
It’s the terminology.
It’s the notation.
It’s the casual use of big concepts from big disciplines … all brought together with grace and finesse … and you realize that you just can’t follow what they’re doing.
This is so not your fault … BUT … you know that it’s something that you need to fix.
So you start drilling in. “Partition function.” “Energy of the state.”
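(For the record, here’s where those searches eventually lead. In standard textbook form – not the exact notation of that paper, and with temperature folded into the energy – the “energy” E(x) of a state x sets how probable that state is, and the “partition function” Z is just the normalizer that makes the probabilities sum to one:

p(x) = e^{-E(x)} / Z, \qquad Z = \sum_x e^{-E(x)}

Easy to write down. Not so easy to see why it matters – and that’s the real gap.)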
A few hours of Google searches, Wikipedia reads, tech blog reads, and more … dinner spent watching YouTube videos … and all you feel is drained. Exhausted.
And maybe … even a bit scared.
You know that the way forward involves understanding these terms.
But for all practical purposes, these papers are written in a highly arcane, abstract language … one that you can’t seem to learn on your own.
Sign up for the three-week Top Ten Terms in Statistical Mechanics with the Themesis Academy hosted by Thinkific:
https://www.themesis.thinkific.com
The problem is that there are at least three “languages” involved:
- Statistical mechanics (or statistical physics; they’re the same thing),
- Bayesian probabilities, and
- The Kullback-Leibler divergence.
Each of these is its own particular “mountain range.”
Together, they make up the Donner Pass of AI.
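(That third one, at least, has a compact standard definition – textbook form again, nothing specific to this course. The Kullback-Leibler divergence measures how far a distribution P is from a reference distribution Q:

D_{KL}(P \| Q) = \sum_x P(x) \log \frac{P(x)}{Q(x)}

The formula is the easy part. The hard part is seeing how it hooks into the physics.)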
You can teach yourself Bayesian probability. (There are some great vids out there.) You can Google your way through Monte Carlo and the other supporting-cast topics.
It’s statistical mechanics that is the problem.
You need something that covers enough – JUST what you need to know – and can be accomplished within this lifetime.
Preferably, accomplished within a few weeks.
You don’t need to master the whole statistical mechanics language.
And you don’t need to read Feynman. (Much though we love the man.)
You just need a phrasebook: a short-and-fast guide, not two years of graduate-level physics.
We’ve got you covered.
The Top Ten Terms in Statistical Mechanics.
It’s just what it says it is.
The Top Ten Terms is the shortest, easiest, fastest route to understanding JUST ENOUGH statistical mechanics to make your way out of the Donner Pass of AI, and into the California “Gold Coast” – the country of energy-based neural networks.
Meaning – you’ll be able to understand what’s going on, from the Hopfield neural network through Hinton’s original work (the Boltzmann machine), up through the most recent work on deep learning, GANs (generative adversarial networks), and more.
Sign up for the three-week Top Ten Terms in Statistical Mechanics with the Themesis Academy hosted by Thinkific:
https://www.themesis.thinkific.com
You get a kick-off PDF. You can read it in less than an hour, go back to that Salakhutdinov and Hinton paper – and at least not feel lost.
Then, to make your understanding solid – you get three weeks of daily email lessons from us.
- Days 1 – 5 (of each week): You get “core emails,” each taking you to a Content Page for that day’s topic. On each Content Page, you’ll get a short video, a bit of reading, and some “reflection questions.” You’ll also get a SHORT list of extra, optional readings – if you want to dive a bit deeper.
- Day 6 (and Day 13, and Day 20): You get the “Pull It Together” Bonus for that week – another short vid, plus a directed exercise: you’ll go back to the core paper we’re using for this course (Salakhutdinov and Hinton, 2012) and re-read portions of the Intro – and THIS TIME, you’ll understand it a bit more deeply. This is more of a “hands-on” exercise, and you should see substantial progress on a week-by-week basis.
- Day 7 (and Day 14, and Day 21): A “Final Reflection” day, where we invite you to select a concept that you learned during the week and meditate on it. ACTIVELY meditate. (I’ll talk about how I do this in an extra Day 7 Bonus Vid.)
By the end of these three weeks, you will be able to read the Introduction to Salakhutdinov and Hinton (2012), and understand enough to progress to the next section on your own.
This means that you’ll have worked your way out of the Donner Pass of AI, and be able to successfully navigate to the AI “Gold Coast” on your own.
Not saying it’s trivially easy. You are STILL going through the Sierra Nevada mountains of AI.
But now – it will be doable.
And you’ll rest so much easier, knowing that you have the fundamentals.
Basically, you’ll have a map and the crucial landmarks.
Sign up for the three-week Top Ten Terms in Statistical Mechanics with the Themesis Academy hosted by Thinkific:
https://www.themesis.thinkific.com
Money-Back Guarantee: If, after completing this course, you don’t feel that you can make substantially more headway with the great classics in energy-based neural networks – email us. We’ll get you your money back. No questions, no problems.