A look at the state of AI agents, the evolution of thinking models, the staggering need for inference compute in the coming years, automated research, and more
Evjang.com · Eric Jang
Related Coverage
- Weekly Dose of Optimism #179 Not Boring · Packy McCormick
Discussion
- Andrew McCalip (@andrewmccalip) on X: This was a great piece by Eric, and I'm in total agreement. The amount of intelligence consumption per capita is about to explode. We're going to be inference capacity limited for a long, long time. https://evjang.com/... [image]
- Sholto Douglas (@_sholtodouglas) on X: 100% agree with his conclusions - Eric consistently predicts where the field is going.
- Kevin Roose (@kevinroose) on X: This is great, and the bits about inference demand ring true. It's trivially easy to generate 100ks of tokens in a single session with a coding agent (sometimes a single prompt!) and I'm not even using this stuff professionally. We're gonna need a lot more data centers.
- Eric Jang (@ericjang11) on X: While re-reading As We May Think, I thought it would be fun to show some side-by-sides of inventions Vannevar Bush predicted in 1945 compared to their modern instantiations as information processing systems. 1/4 [image]
- Aarthi Ramamurthy (@aarthir) on X: “People who can direct teams of agents at goals and know how to judge what to focus on in a full-stack scope will experience an exhilarating level of productivity that makes making software a joy again. For roboticists: there is the age-old question of how much we should rely on…”
- Prinz (@deredleritt3r) on X: We are probably building too few data centers [image]
- Nabeel S. Qureshi (@nabeelqu) on X: Everyone underestimates just how much inference we will need. Excellent essay: [image]
- Jennifer Li (@jenniferhli) on X: Good read. And a powerful comparison. “Based on my own usage patterns, it's beginning to dawn on me how much inference compute we will need in the coming years. I don't think people have begun to fathom how much we will need. Even if you think you are AGI-pilled, I think you are…”
- Flo Crivello (@altimor) on X: Hard agree with this point. People underestimate how much inference we will need by many, many orders of magnitude. [image]
- Gabriel (@gbrl_dick) on X: incredible essay — the first half explains structural advances in LLMs very clearly, but i was struck by this towards the end. as models get better at longer horizon tasks, inference demand obviously goes up because longer reasoning uses more tokens but what i hadn't really… [image]
- Eric Jang (@ericjang11) on X: As Rocks May Think: an interactive essay on thinking models, automated research, and where I think they are headed. Enjoy! https://evjang.com/...
- Dean W. Ball (@deanwball) on X: Recently I did some back-of-the-envelope math and realized that my heightened use of coding agents in the last few months means that my daily token in/out consumption has probably increased by ~2 *orders of magnitude*, and I was a heavy user of reasoning chatbots before.