2025-12-11
Listened to this plus Gavin's AI thoughts post. He seems very confident in pre-training scaling laws holding and I'm just... not so sure? The argument is very focused on advancements in compute pushing pre-training but, definitionally, there need to be commensurate increases in
Invest Like The Best on YouTube
Q&A with investor Gavin Baker of Atreides Management on the economics of AI, data centers in space, mistakes SaaS companies are making in adopting AI, and more
In this episode of Invest Like The Best, Patrick O'Shaughnessy sits down with investor Gavin Baker to explore the rapidly evolving AI landscape.
2025-02-16
So much great stuff in here. Particularly fun is talk of distributed training around (1:02:30), referencing early trials of async training, and scaling async — “as we scale up [training], there may be a push to have a bit more asynchrony in our systems than we do now” 👀
Dwarkesh Podcast
Q&A with Google Gemini co-leads Jeff Dean and Noam Shazeer on Google's path to AGI, the future of Moore's Law, TPUs, inference scaling, open research, and more
Two of Gemini's co-leads on Google's path to AGI — This week I welcome on the show two of the most important technologists ever, in any field.