A look at the state of AI agents, the evolution of thinking models, the staggering need for inference compute in the coming years, automated research, and more
— Dr. Vannevar Bush, As We May Think, 1945

If we consider life to be a sort of open-ended MMO, the game server has just received a major update.
evjang.com · Eric Jang
Related Coverage
- Weekly Dose of Optimism #179 Not Boring · Packy McCormick
Discussion
-
@andrewmccalip
Andrew McCalip
on x
This was a great piece by Eric, and I'm in total agreement. The amount of intelligence consumption per capita is about to explode. We're going to be inference capacity limited for a long, long time. https://evjang.com/...
-
@kevinroose
Kevin Roose
on x
This is great, and the bits about inference demand ring true. It's trivially easy to generate 100ks of tokens in a single session with a coding agent (sometimes a single prompt!) and I'm not even using this stuff professionally. We're gonna need a lot more data centers.
-
@_sholtodouglas
Sholto Douglas
on x
100% agree with his conclusions - Eric consistently predicts where the field is going.
-
@ericjang11
Eric Jang
on x
While re-reading As We May Think I thought it would be fun to show some of the side-by-sides of inventions Vannevar Bush predicted in 1945 compared to their modern instantiation of information processing systems. 1/4
-
@aarthir
Aarthi Ramamurthy
on x
“People who can direct teams of agents at goals and know how to judge what to focus on in a full-stack scope will experience an exhilarating level of productivity that makes making software a joy again. For roboticists: there is the age-old question of how much we should rely on …
-
@deredleritt3r
Prinz
on x
We are probably building too few data centers
-
@nabeelqu
Nabeel S. Qureshi
on x
Everyone underestimates just how much inference we will need. Excellent essay:
-
@jenniferhli
Jennifer Li
on x
Good read. And a powerful comparison. “Based on my own usage patterns, it's beginning to dawn on me how much inference compute we will need in the coming years. I don't think people have begun to fathom how much we will need. Even if you think you are AGI-pilled, I think you are …
-
@altimor
Flo Crivello
on x
Hard agree with this point. People underestimate how much inference we will need by many, many orders of magnitude.
-
@gbrl_dick
Gabriel
on x
incredible essay — the first half explains structural advances in LLMs very clearly, but i was struck by this towards the end. as models get better at longer horizon tasks, inference demand obviously goes up because longer reasoning uses more tokens but what i hadn't really …
-
@ericjang11
Eric Jang
on x
As Rocks May Think: an interactive essay on thinking models, automated research, and where I think they are headed. Enjoy! https://evjang.com/...
-
@deanwball
Dean W. Ball
on x
Recently I did some back-of-the-envelope math and realized that my heightened use of coding agents in the last few months means that my daily token in/out consumption has probably increased by ~2 *orders of magnitude,* and I was a heavy user of reasoning chatbots before.
-
@mascobot
Marco Mascorro
on x
This is a great read from Eric. I feel the same way in many ways, and it resonates a lot with the conversations with researcher friends I've had over the last few months. The world has changed a lot since 2022 (since the release of ChatGPT), and even more in the last year. If …
-
@drivelinekyle
Kyle Boddy
on x
A lot to grasp from this section if you're someone who likes learning stuff.
-
@bilaltwovec
Bilal
on x
i no longer launch any of my own jobs. it's glorious not waking up to a run that ended up useless because you made a mistake in the yaml
-
@danielrock
Daniel Rock
on x
Fascinating and thought-provoking essay. I'd like to think we Rocks at least sometimes thought before, but also sometimes in our family there is evidence to the contrary.