Company

Transformer

Filtered to launch pattern
10 articles decelerating

Transformer has appeared in 10 articles since 2022-04. Coverage peaked in 2024Q2 with 3 articles. Frequently mentioned alongside Google, Gemini, LLM, Aaron Levie.
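The "frequently mentioned alongside" list is presumably derived from entity co-occurrence: counting which other entities appear in the same articles as the target. A minimal sketch of that idea, assuming articles are represented as lists of extracted entity names (the page does not show the real pipeline):

```python
from collections import Counter

def co_mentions(articles, target):
    """Count how often other entities appear in articles mentioning `target`.

    `articles` is a list of per-article entity-name lists (hypothetical
    input shape; the site's actual extraction pipeline is not shown here).
    """
    counts = Counter()
    for entities in articles:
        if target in entities:
            counts.update(e for e in entities if e != target)
    return counts.most_common()

articles = [
    ["Transformer", "Google", "Gemini"],
    ["Transformer", "Google", "LLM"],
    ["Apple", "Vision Pro"],
]
print(co_mentions(articles, "Transformer"))
# Google appears twice; Gemini and LLM once each
```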

Articles: 10 mentions
Velocity: -50.0% growth rate
Acceleration: -1.500 velocity change
Sources: 8 publications
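One plausible reading of these metrics, consistent with the displayed figures: velocity is the period-over-period growth rate of article counts, and acceleration is the change in that growth rate between consecutive periods. This is an assumption, not necessarily the site's exact formula, but with quarterly counts of 2, 4, 2 it reproduces both -50.0% and -1.500:

```python
def velocity(counts):
    """Period-over-period growth rate of the latest period."""
    prev, cur = counts[-2], counts[-1]
    return (cur - prev) / prev

def acceleration(counts):
    """Change in growth rate between the last two periods."""
    return velocity(counts) - velocity(counts[:-1])

quarterly = [2, 4, 2]  # hypothetical article counts per quarter
print(f"velocity: {velocity(quarterly):+.1%}")       # velocity: -50.0%
print(f"acceleration: {acceleration(quarterly):+.3f}")  # acceleration: -1.500
```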

Coverage Timeline

2025-10-14
@karpathy 2 related

Andrej Karpathy unveils nanochat, a full-stack training and inference implementation of an LLM in a single, dependency-minimal codebase, deployable in 4 hours

It provides a full ChatGPT-style LLM, including training, inference and a web UI … X: Clem / @clementdelangue : Am I wrong in sensing a paradigm shift in AI? Feels like we're moving from a world obses...

2025-05-22
Fortune 13 related

Google DeepMind says Gemini Diffusion, an experimental text diffusion model demoed at Google I/O and available by waitlist, generates 1,000-2,000 tokens/second

Our state-of-the-art, experimental text diffusion model Jose Antonio Lanz / Decrypt : Google Doubles Down on AI: Veo 3, Imagen 4 and Gemini Diffusion Push Creative Boundaries Matthias Bastian / The De...

2025-04-01
TechCrunch 26 related

Sam Altman says OpenAI plans to “release a powerful new open-weight language model with reasoning in the coming months”, its first open-weight model since GPT-2

just look at the “T” in ChatGPT, which comes from the Transformer architecture openly shared by Google. Then came Garry Tan / @garrytan : Open weights 🚀 Alexander Doria / @dorialexander : Ok, this one...

2025-02-16
Dwarkesh Podcast 2 related

Q&A with Google Gemini co-leads Jeff Dean and Noam Shazeer on Google's path to AGI, the future of Moore's Law, TPUs, inference scaling, open research, and more

“as we scale up [training], there may be a push to have a bit more asynchrony in our systems than we do now” 👀 Haider / @slow_developer : Google Chief Scientist, Jeff Dean “AI now generates 25% of Goo...

2022-04-06
Google AI Blog 6 related

Google AI claims PaLM, its 540B parameter, dense decoder-only Transformer model, shows breakthrough capabilities in tasks like language, reasoning, and coding

In recent years, large neural networks trained for language understanding and generation have achieved impressive results across a wide range of tasks.


Quarterly Coverage

Top Sources

Narrative

TEXXR tracks 31 Techmeme articles mentioning Transformer, dating back to January 2015. The biggest stories include “Sources: Amazon's ZeroOne unit is developing ‘Transformer’, a phone that syncs with...” and “Nvidia debuts Nemotron-Nano-9B-v2, a hybrid Mamba-Transformer model, saying it achieves...”. Frequently covered alongside Google, Nvidia, Apple, Asus, and Robotics Transformer 2. Coverage has shifted toward funding themes and away from consumer and research themes.

Key Moments

2024Q2: funding +100pts
2024Q4: consumer +60pts; research -20pts; funding -100pts
2025Q1: consumer +40pts; research +20pts; funding +100pts
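The per-quarter "pts" shifts read as quarter-over-quarter changes in per-theme scores. A sketch under that assumption, with hypothetical prior-quarter scores chosen to reproduce the 2024Q4 row (the page does not define the point scale):

```python
def moment_deltas(scores):
    """Quarter-over-quarter point changes per theme.

    `scores` maps quarter -> {theme: points}; both the score values and
    the point scale here are assumptions for illustration.
    """
    quarters = sorted(scores)
    deltas = {}
    for prev, cur in zip(quarters, quarters[1:]):
        themes = set(scores[prev]) | set(scores[cur])
        deltas[cur] = {
            t: scores[cur].get(t, 0) - scores[prev].get(t, 0)
            for t in themes
        }
    return deltas

scores = {
    "2024Q3": {"funding": 100, "consumer": 0, "research": 20},
    "2024Q4": {"funding": 0, "consumer": 60, "research": 0},
}
print(moment_deltas(scores)["2024Q4"])
# consumer +60, research -20, funding -100, matching the 2024Q4 row
```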

Relationships
