VOICE ARCHIVE

Sam Lehman

@splehman
3 posts
2025-12-11
Listened to this plus Gavin's ai thoughts post. He seems very confident in pre-training scaling laws holding and I'm just... not so sure? The argument is very focused on advancements in compute pushing pre-training but, definitionally, there needs to be commensurate increases in
Invest Like The Best on YouTube

Q&A with investor Gavin Baker of Atreides Management on the economics of AI, data centers in space, mistakes SaaS companies are making in adopting AI, and more

In this episode of Invest Like The Best, Patrick O'Shaughnessy sits down with investor Gavin Baker to explore the rapidly evolving AI landscape.
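The skepticism in the post above turns on a real constraint: compute-optimal pre-training requires training data to grow alongside model size, not just compute. As an illustrative sketch (not the author's analysis), the parametric loss fit from the Chinchilla paper (Hoffmann et al., 2022) makes the tradeoff concrete; the constants are that paper's reported values, and the helper names are invented here:

```python
# Illustrative Chinchilla-style scaling-law sketch.
# A, B, E, alpha, beta are the fitted constants reported by
# Hoffmann et al. (2022); treat them as illustrative, not authoritative.

def loss(n_params, n_tokens, A=406.4, B=410.7, E=1.69, alpha=0.34, beta=0.28):
    """Predicted pre-training loss for a model with n_params parameters
    trained on n_tokens tokens."""
    return E + A / n_params**alpha + B / n_tokens**beta

def compute_optimal_split(flops):
    """Compute-optimal allocation: with training cost C ~ 6*N*D,
    parameters N and tokens D each scale roughly as C**0.5."""
    n_params = (flops / 6) ** 0.5
    n_tokens = (flops / 6) ** 0.5
    return n_params, n_tokens

# Growing compute (and thus N) while holding the dataset fixed leaves
# the loss dominated by the data term B / D**beta -- the "commensurate
# increases" the post is pointing at.
n, d = compute_optimal_split(1e24)
```

The sketch shows why "more compute" alone is not sufficient under this model: the data term floors the achievable loss unless token count scales too.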

2025-02-16
So much great stuff in here. Particularly fun is talk of distributed training around (1:02:30), referencing early trials of async training, and scaling async — “as we scale up [training], there may be a push to have a bit more asynchrony in our systems than we do now” 👀
Dwarkesh Podcast

Q&A with Google Gemini co-leads Jeff Dean and Noam Shazeer on Google's path to AGI, the future of Moore's Law, TPUs, inference scaling, open research, and more

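The asynchrony Dean alludes to can be illustrated with a toy stale-gradient update loop, a heavy simplification of parameter-server-style asynchronous SGD; everything below (names, constants, the quadratic objective) is invented for illustration and is not from the episode:

```python
# Toy simulation of asynchronous (stale-gradient) SGD on a 1-D quadratic.
# In async data-parallel training, a worker's gradient is computed against
# a copy of the weights that is several updates old by the time it is applied.

def grad(w):
    # Gradient of f(w) = (w - 3)^2, which has its optimum at w* = 3.
    return 2.0 * (w - 3.0)

def async_sgd(steps=50, lr=0.05, staleness=2):
    """Apply each update using weights `staleness` steps out of date."""
    w = 0.0
    history = [w]
    for _ in range(steps):
        stale_w = history[max(0, len(history) - 1 - staleness)]
        w -= lr * grad(stale_w)  # update computed from stale weights
        history.append(w)
    return w

# With modest staleness the run still converges near w* = 3, which is
# one intuition for why "a bit more asynchrony" can be tolerable at scale.
final_w = async_sgd()
```

With larger staleness or learning rate the stale-gradient recurrence becomes unstable, which is the tension behind scaling asynchrony carefully rather than freely.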

2025-02-15
So much great stuff in here. Particularly fun is talk of distributed training around (1:02:30), referencing early trials of async training, and scaling async — “as we scale up [training], there may be a push to have a bit more asynchrony in our systems than we do now” 👀
Dwarkesh Podcast

Q&A with Google's Jeff Dean and Noam Shazeer on Google's path to AGI, future of Moore's Law and TPUs, inference scaling, open research, and more

Two of Gemini's co-leads on Google's path to AGI  —  This week I welcome on the show two of the most important technologists ever, in any field.