
Chronicles

The story behind the story


An Ai2 research scientist argues that AGI may never emerge because the concept ignores the physical realities and limits of computation, such as energy constraints.

If you are reading this, you probably have strong opinions about AGI, superintelligence, and the future of AI.

Reactions — X: @scaling01, @sriramk, and @tim_dettmers. LinkedIn: Ryan Iyengar and Ali Minai. Bluesky: @alexcampolo and @sungkim. Forums: Hacker News.

X:
- @scaling01: I think the ultimate test for AGI is whether AI can debate right now it's fucking terrible at it it keeps moving goalposts and a simple "are you sure" makes it switch positions
- Sriram Krishnan / @sriramk: Fascinating read on AGI and why we are fundamentally constrained
- Tim Dettmers / @tim_dettmers: Many people think AI will continue improve towards AGI. In my new blog post, I argue that we will not reach AGI due to physical reasons. Key items discussed: The physical reality of computation. Why GPUs will no longer improve. Why superintelligence is a fantasy
- Tim Dettmers / @tim_dettmers: My new blog post discusses the physical reality of computation and why this means we will not see AGI or any meaningful superintelligence: https://timdettmers.com/...

LinkedIn:
- Ryan Iyengar: A solid argument from first principles why the classical definition of AGI won't happen. Recent models are certainly wildly impressive and useful! …
- Ali Minai: A very interesting contribution to the growing genre of AGI skepticism. The article makes many important points - especially the physical nature …

Bluesky:
- Alex Campolo / @alexcampolo: "This amplification of bad ideas and thinking exhuded by the rationalist and EA movements, is a big problem in shaping a beneficial future for everyone." timdettmers.com/2025/12/10/w...
- Sung Kim / @sungkim: Why AGI Will Not Happen by Tim Dettmers. This blog post is for those who want to think more carefully about these claims and examine them from a perspective that is often missing in the current discourse: the physical reality of computation. timdettmers.com/2025/12/10/w...

Forums:
- Hacker News: Why AGI Will Not Happen

Tim Dettmers