Chronicles

The story behind the story


In recent interviews, Sam Altman said AI's adoption faces more resistance than he expected, while Jensen Huang warned the “doomer narrative” may be winning

Tech leaders are beginning to worry about the public's underwhelming enthusiasm for their plans to remake the world with artificial intelligence. LinkedIn: Bridget Fahrland and Daron Yondem

New York Times / David Streitfeld

Discussion

  • @carljanderson @carljanderson on bluesky
    Let me say this again..  —  GenAI is technology for technology's sake.  There is no whiz bang use case that makes any of it worth it.  I say this as someone who has been in tech for 30+ years.  —  It was easy to sell VMware.  It's easy to sell cloud.  —  This.. what's the use cas…
  • @rosemarymosco.com Rosemary Mosco on bluesky
    My dearly departed dad was quoted in the NY Times today.  I miss him and his wisdom so much www.nytimes.com/2026/02/21/t...  [image]
  • @prietschka Paul Rietschka on bluesky
    I was around for the Dotcom era, this is no Dotcom era.  —  In the Dotcom bubble everything was new, and the Internet felt like wide frontier everyone could have a stake in.  There was a sense of promise, as if we were on the cusp of something big and new (and we fucking were).  …
  • @mweinbach Max Weinbach on x
    This alone may have lost my trust in Sama to build a good AI company I understand the point he's trying to make, but this is trying to break down people and models into cost for output and ignoring the value of humanity itself It's a bad path imo
  • @effiebio Effie Klimi on x
    I assume OpenAI holds some kind of internal competition for worst comms idea in terms of likely perception by the general public and the winner gets to have their suggestions spoken out by sama
  • @realbalajee @realbalajee on x
    Sam is simply rebutting a common argument made against the efficiency of AI models. While I don't think he is right because humans have a higher capacity to learn / become more efficient over time than a trained AI models, I find the moral pearl clutching to be quite dumb.
  • @edels0n Ed Elson on x
    OpenAI should put Sam wherever the Biden 2020 team put Joe
  • @marxicology Laura Palmer Raids on x
    If you understand this thinking in the context of climate change, you can see how these guys are telegraphing the arguments that they'll be making to consign whole parts of the population to absolute misery and suffering.
  • @nc_renic Neil Renic on x
    I guess this view, that humans are energy-inefficient, near-obsolete, computers, is easier when you've had no contact with their other outputs, like friendship, humour, and love.
  • @jerseyh0mo Davey on x
    The entire tech industry is run by creepy, antisocial psychopaths who don't care about humanity. These ghouls possess zero empathy. And they're so unfathomably rich that they can buy politicians to pass laws on their behalf. They are the greatest danger to our species and planet.
  • @luizajarovsky Luiza Jarovsky, PhD on x
    🚨 Something IMPORTANT is happening here: Notice how many people felt deeply uncomfortable with his statements. Many haven't realized it, but a new form of morality is emerging: one in which treating humans as expendable, worthless, or less than machines is wrong. This is a
  • @jimstewartson @jimstewartson on x
    Sam Altman needs an atomic wedgie, not a trillion dollar company. This is wrong, absolutely amoral, and total self-own. This is what happens when Peter Thiel picks your CEO. You get a clueless misanthrope who will do and say anything for money. PSA: Chatbots are over.
  • @kapoorkkunal Kunal Kapoor on x
    I'm very pro tech and his intent here is to talk about energy, but you can see why people are becoming increasingly uncomfortable with the tech bros. It reflects a broader discomfort in how they frame humanity.  Culturally, some of this language starts to echo the archetype of th…
  • @sketchesbyboze @sketchesbyboze on x
    You need to understand that Silicon Valley is increasingly being run by creepy, dead-eyed men who hate families, who hate kids, who have no respect for life, who would like to see much of humanity dead or enslaved. It's our job to defeat the dystopia they're so eager to build.
  • @davidbessis David Bessis on x
    Why does Sam insist on declaring war against his entire customer base? That's a truly bizarre positioning for a mainstream business.
  • @melatkirosco @melatkirosco on x
    This is how disposable we are to the billionaires developing AI. That's why I'm calling for regulations on AI and the data centers that are straining our energy grids. There's too much at stake with this technology, and we need to make sure AI benefits us all, not just the few.
  • @kortizart Karla Ortiz on x
    This is: 1. BS from a snake oil salesman 2. Psychotic 3. A position based on so much disdain for you and your future. Their proposal is to take everything away from you: opportunities, livelihoods, environment. We gotta legally rein in these people and their companies asap
  • @toadsanime Ryan T. Brown on x
    This is an evil, warped worldview from an evil, warped man who has lost his humanity and doesn't see people as any different to machines with a work/financial purpose. We are inefficient tools to the inhuman cretin. He can't be fixed. The world is worse for his existence.
  • @charlottealter Charlotte Alter on x
    there is something so fundamentally gross abt this
  • @rosiegray Rosie Gray on x
    Are the AI people trying on purpose to come across as completely sinister
  • @aakashgupta Aakash Gupta on x
    A human consumes about 2,000 calories per day.  Over 20 years, that's roughly 17,000 kWh of total food energy.  Training GPT-4 consumed an estimated 50 GWh of electricity.  That's 3,000 humans worth of “training energy” for a single model run.  And GPT-4 is already dead.  OpenAI …
  • @seanilling Sean Illing on x
    I simply cannot express how passionately I dislike these people
  • @boringbiz_ @boringbiz_ on x
    I honestly think the reason we see so much main street pushback against AI is because of how the leaders of the largest companies in the space have framed the technology If they had just said they were building cool technology that helps people be more productive, it would be
  • @framesofnick Nick on x
    This is the talk of a traitor to the human race
  • @xriskology Dr. Émile P. Torres on x
    “To train a human.” Altman has previously said that people are nothing more than “stochastic parrots.” He holds a profound dehumanizing view of our species.
  • @christosargyrop @christosargyrop on x
    Let's fact check Sammie here. 25 years x 365 days = 9,125 days Caloric needs vary by age; let's assume 2500 on average ~ 23M calories ~ 27 kilowatt-hours to raise a human. A gaming GPU is about 0.4 watts/hr. So playing games for 3 days straight consumes more energy than raising
  • @tanya_sabrinaaa Tanya on x
    we should stop feeding children so we can feed data centers
  • @snmrrw Sean Morrow on x
    He's danced around this before, but this seems like the most direct admission that he's anti-humanity, a traitor against us all who should be treated as such.
  • @shauseth Shaurya on x
    calling humans meat computers that eat too much food was a completely optional way to address the issue btw
  • @heyaimsarah Sarah on x
    America's treasure is her natural resources. These untouched lands that belong to us, the public. Don't let them trade this for a bunch of fucking data centers [image]
  • @nomads4pritzker @nomads4pritzker on x
    Famously, human learning takes *very little* energy. A toddler learns thousands of words and an entire grammar while eating nothing but raspberries, before learning to read. An LLM needs a nuclear reactor and the entire internet as a training corpus.
  • @tompark1n Tom Parkin on x
    Humans are a waste of life and food energy, says AI billionaire
  • @mmitchell_ai @mmitchell_ai on x
    A problem, it seems to me, is that many people don't see or experience people's humanity. #NotAllPeople. If you don't experience others' humanity, then reducing them to computer-like processing is completely sensible. 1/
  • @brianbeutler Brian Beutler on x
    Psychotic.
  • @koylanai @koylanai on x
    I build AI for a living. I believe in what we're building. But this kind of rhetoric makes my work harder and more dangerous. @sama, comparing human development to model training is tone-deaf, strategically reckless. People are losing jobs. They're getting angry. They're
  • @svembu Sridhar Vembu on x
    I do not want to see a world where we equate a piece of technology to a human being. I work hard as a technologist to see a world where we don't allow technology to dominate our lives, instead it should quietly recede into the background.
  • @bigblackjacobin Edward Ongweso Jr on x
    we can write a million essays about how the future Silicon Valley wants to build is underwritten by a deep disgust with / contempt for Being A Human, or we can just let them speak for themselves
  • @rob_flaherty Rob Flaherty on x
    One of the fundamental problems with “just let the AI guys do whatever they want” is that the AI guys seem to have never interacted with a real person in their entire lives https://x.com/...
  • @topherspiro Topher Spiro on x
    These people need to be regulated.
  • @annaciaunica @annaciaunica on x
    Looks like our paper with @erikjbekkers is more timely than ever ! If you had to chose between unplugging an AI claiming it's conscious and or a pre-term incubator - which one would you choose ? https://arxiv.org/... [image]
  • @david_fairchild L. David Fairchild on x
    He's not just defending AI energy use.  He is smuggling in a whole anthropology where humans are basically inefficient meat computers that you have to pour food and years into before they become useful...
  • @arzandc Arzan Tarapore on x
    This is why we need philosophers, not only coding nerds.
  • @samhaselby Sam Haselby on x
    This is how industrial agriculture talks about livestock.
  • @alexbores Alex Bores on x
    Computer scientist and new dad here. I'm raising a child because that's the joy of life. I'm running for Congress because my son—all of our kids—deserve AI that enriches their lives and works for them, not against them. Altman's mission seems to be unfettered AI at all of our
  • @gmiller Geoffrey Miller on x
    Conservatives need to understand that the AI industry's foundational ideology is contempt for humans, and the total confidence that they can replace humans with entities that are more ‘efficient’. If you support that, you're not a ‘conservative’ in any meaningful way.
  • @matthewstoller Matt Stoller on x
    He's saying a really big spreadsheet and a baby are morally equivalent. One reason to believe that life is divine is so that you don't allow sociopaths like this anywhere near anything important.
  • @mollysoshea Molly O'Shea on x
    Yes, Sam — but brains are far more energy efficient than computers. Brain: 20 watts AI: gigawatts Biology wins. Naveen Rao (@NaveenGRao) CEO, @unconvAI Konstantine Buhler (@Konstantine) Partner, @sequoia TLDR: Biology currently delivers far more general intelligence per [video]
  • @motorhueso Eugenio Tisselli on x
    I don't know how much energy was used to train Altman, but it clearly was a huge waste.
  • @alex_peys Alex Peysakhovich on x
    ~3000 calories per day = 22m calories in 20 years =~ 25k kw - assume 10x inefficiency in production = 250k kw ~ assume 10 kw per hour to run a 8x H100 machine = 25k*8 = 200k gpu hours llama3 8b took 1.3m gpu hours (~6 people), 405b took 31m (~150 people) this doesn't count all
  • @quinnslobodian.com Quinn Slobodian on bluesky
    Speed of gestation/maturation has been a problem for the tech-eugenicists for some time, laid out at length in Bostrom's Superintelligence (2014) [embedded post]
  • @metacurity.com Cynthia Brumfield on bluesky
    techcrunch.com/2026/02/21/s...  Sam Altman would like you to know that AI is going to use only the energy of the workers it displaces once we starve those workers to death so we can use their energy to fuel AI or something to that effect.
  • @tcarmody Tim Carmody on bluesky
    He really thought he had a winner with this one.  “Check and mate!” [embedded post]
  • @fatraccoon @fatraccoon on bluesky
    Man with above average intelligence easily steps into logical fallacy trap.  —  “But it also takes a lot of energy to train a human,” Altman said.  “It takes like 20 years of life and all of the food you eat during that time before you get smart...”  —  techcrunch.com/2026/02/21/…
  • @jfarkas Johan Farkas on bluesky
    When tech companies anthropomorphize AI, they also dehumanise people.  Case in point: Sam Altman just said that we shouldn't worry about the climate impact of AI because humans eat food.  —  techcrunch.com/2026/02/21/s...
  • @harfenist Ethan Harfenshvitz on bluesky
    Being confident enough to say something this ghoulish to an audience should automatically get you investigated for potentially abusing employees.  —  This dude clearly doesn't value human life and looks at the rest of us as meat for his grinder
  • @seanokane Sean O'Kane on bluesky
    My god this is genuinely one of the dumbest arguments I've ever heard  —  techcrunch.com/2026/02/21/s...
  • @obyrska Olga Byrska on bluesky
    Meanwhile, in our dystopia: “People talk about how much energy it takes to train an AI model ... But it also takes a lot of energy to train a human.  It takes like 20 years of life and all of the food you eat during that time before you get smart.”  Sam Altman
  • @Edelruth@mastodon.online @Edelruth@mastodon.online on mastodon
    @Techmeme  —  Another anthropomorphization of the LLM: now it is ‘raised’, and comparable to a child.
  • @gerrymcgovern@mastodon.green Gerry McGovern on mastodon
Sam Altman would like to remind you that humans use a lot of energy, too  —  Altman — who was in India for a major AI summit — said concerns about AI's water usage are “totally fake,” though he acknowledged it was a real issue when “we used to do evaporative cooling in data centers.…
  • r/behindthebastards r on reddit
    Sam Altman would like to remind you that humans use a lot of energy, too
  • r/BetterOffline r on reddit
    Sam Altman would like to remind you that humans use a lot of energy, too
  • r/technology r on reddit
    Sam Altman would like to remind you that humans use a lot of energy, too
  • @kenroth Kenneth Roth on x
    A.I. regulation is one of the few issues that a divided America seems united on. Eighty percent of Americans want rules for A.I. even if that means the technology develops more slowly, according to a Gallup survey last spring. But not Trump. https://www.nytimes.com/...
  • @martahari Marta Bulaich on x
    Is the AI doomer narrative winning? https://www.nytimes.com/...
  • @christinayiotis @christinayiotis on x
    “Tech leaders are beginning to worry about the public's underwhelming enthusiasm for their plans to remake the world with artificial intelligence. Will that burst the bubble?” https://www.nytimes.com/... @nytimes
  • @patriciagbarnes Patricia G. Barnes on x
    The poster boy for the AI movement, Sam Altman, seems deeply troubled and pathologically self-serving, as do many of his counterparts (Meta, Palantir, Google). Americans see this. ... People Loved the Dot-Com Boom. The A.I. Boom, Not So Much. https://www.nytimes.com/...
  • @misterjayem @misterjayem on bluesky
    “Jensen Huang, chief executive of the chip maker Nvidia, is worried.  The tech industry hype may seem omnipresent, but Mr. Huang feels ‘the battle of narratives’ is being won by the critics.  —  “'It's extremely hurtful, frankly,' Mr. Huang said”
  • @brianralexander @brianralexander on bluesky
    Let us tell you how you will live.  —  Let us consume your creative life for our profit.  —  Let us shape your society.  —  Let us be your government.  —  Why don't you love and admire us?  —  www.nytimes.com/2026/02/21/t...
  • @anthonyha Anthony Ha on bluesky
    I just keep reading these quotes and cackling  —  www.nytimes.com/2026/02/21/t...  [image]
  • @roopikarisam Roopika Risam on bluesky
    “The creators of a new technology have always sold it as producing a fundamental transformation of human existence.”
  • @karlbode.com Karl Bode on bluesky
    even this NYT article ostensibly about people being resistant to AI doesn't REALLY want to tangle with the long list of reasons why (pollution, dismantled climate goals, the fact surveillance obsessed tech companies saddled up to fascism).  They're given a throwaway sentence or t…
  • @mims Christopher Mims on bluesky
    I cannot emphasize this enough: The AI doomer narrative (it's going to wake up and kill us all) is, whether its adherents intend it or not, pure marketing for AI companies  —  It's one reason their CEOs have even, on occasion, leaned into it [embedded post]
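Several posts above trade back-of-envelope numbers on human food energy versus model training energy. A minimal sketch reproducing that arithmetic, using only the figures quoted in the discussion (2,000 kcal/day, 20 years, and the posts' unverified ~50 GWh estimate for GPT-4 training), not independently verified sources:

```python
# Rough energy comparison using figures quoted in the posts above.
# All inputs (2,000 kcal/day, 20 years, 50 GWh) are the posters' own
# estimates, not verified data.

KCAL_TO_KWH = 1.163e-3  # 1 kilocalorie = 0.001163 kilowatt-hours

def human_food_energy_kwh(kcal_per_day: float = 2000, years: float = 20) -> float:
    """Total food energy one person consumes over `years`, in kWh."""
    return kcal_per_day * 365 * years * KCAL_TO_KWH

human_kwh = human_food_energy_kwh()      # ~17,000 kWh, matching the quoted figure
gpt4_training_kwh = 50e9 / 1e3           # quoted 50 GWh estimate, converted to kWh
ratio = gpt4_training_kwh / human_kwh    # ~3,000 "humans worth" per training run

print(f"Human (20 yr of food):   {human_kwh:,.0f} kWh")
print(f"GPT-4 training estimate: {gpt4_training_kwh:,.0f} kWh")
print(f"Ratio:                   ~{ratio:,.0f} humans per training run")
```

Running this recovers the ~17,000 kWh and ~3,000-humans figures cited in one post; it also shows why another post's "27 kilowatt-hours to raise a human" is off by three orders of magnitude, since 23 million kcal is roughly 27,000 kWh, not 27.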