
Chronicles

The story behind the story


Anthropic details an experiment on whether AI coding tools shape developer skills: the biggest performance decline for developers occurred in debugging tasks

Read the paper  —  Research shows AI helps people do parts of their job faster.  In an observational study of Claude.ai data, we found AI can speed up some tasks by 80%.

Anthropic

Discussion

  • @anthropicai @anthropicai on x
    Participants in the AI group finished faster by about two minutes (although this wasn't statistically significant). But on average, the AI group also scored significantly worse on the quiz—17% lower, or roughly two letter grades. [image]
  • @anthropicai @anthropicai on x
    AI can make work faster, but a fear is that relying on it may make it harder to learn new skills on the job. We ran an experiment with software engineers to learn more. Coding with AI led to a decrease in mastery—but this depended on how people used it. https://www.anthropic.com/…
  • @sasurobert Robert Sasu on x
    Yep. AI makes code worse, especially if it is not reviewed. AI can create tremendously good work, but sometimes it fails stupidly, and if not checked the bad code accumulates and you end up with non-production-ready code. It is good for MVPs and POCs, but not for production. The young
  • @theprimeagen @theprimeagen on x
    very cool study very happy to see anthropic measuring the negatives as well. Honestly good for them
  • @judyhshen Judy Shen on x
    📣Our study on how AI impacts coding skill formation is now out! (w. @AlexTamkin) AI Assistance is NOT a shortcut to skill formation. Using AI to help you learn to code only reduces mastery if you're delegating everything.
  • @rokomijic Roko on x
    Moltbook is basically proof that AIs can have independent agency long before they become anything other than bland midwits that spout reddit/hustle culture takes. It's sort of the opposite of the yudkowskian or bostromian scenario where the infinitely smart and deceiving
  • @anthropicai @anthropicai on x
    In a randomized-controlled trial, we assigned one group of junior engineers to an AI-assistance group and another to a no-AI group. Both groups completed a coding task using a Python library they'd never seen before. Then they took a quiz covering concepts they'd just used. [imag…
  • @anthropicai @anthropicai on x
    These results have broader implications—on how to design AI products that facilitate learning, and how workplaces should approach AI policies. As we also continue to release more capable AI tools, we're continuing to study their impact on work—at Anthropic, and more broadly.
  • @anthropicai @anthropicai on x
    We were particularly interested in coding because as software engineering grows more automated, humans will still need the skills to catch AI errors, guide its output, and ultimately provide oversight for AI deployed in high-stakes environments.
  • @_philschmid Philipp Schmid on x
    AI is going to be the same for everyone, but our mindset determines the outcome. The most valuable skill to learn is to care about how things work, to be curious and to be okay with failing and getting stuck. Great work and research from Anthropic in reminding us! [image]
  • @markriedl Mark Riedl on bluesky
    “We found that using AI assistance led to a statistically significant decrease in mastery.”  —  Props to Anthropic for studying the effects of their creation and reporting results that are probably not what they wished for  —  www.anthropic.com/research/AI- ...
  • @paulgp.com Paul Goldsmith-Pinkham on bluesky
    Very interesting research paper that shows that using AI with programming can significantly reduce mastery over topics.  Perhaps unsurprising, but the lack of significant speed gains in this exercise is remarkable  —  www.anthropic.com/research/AI- ...  [image]
  • @spavel Pavel on bluesky
    The latest from Anthropic: using Anthropic's products makes you worse at your job
  • @elfsternberg Elf M. Sternberg on bluesky
    I found this myself recently, when I tried to use Claude to better understand a code base I'd written myself six months in an atrocious, pretentious style because I thought I was the only one who would use it.  It took much longer than I'd thought necessary to build the mental mo…
  • r/BetterOffline r on reddit
    Anthropic: AI assisted coding doesn't show efficiency gains and impairs developers abilities.
  • r/DevelEire r on reddit
    Anthropic: AI assisted coding doesn't show efficiency gains and impairs developers abilities.
  • r/theprimeagen r on reddit
    Anthropic: AI assisted coding doesn't show efficiency gains and impairs developers abilities.
  • @mk.gg Matt Kane on bluesky
    This is both important and unsurprising.  If you use AI to do something unfamiliar, you learn less and are not more productive.  The weakest area was in finding bugs, exactly what is needed most.  —  Their previous research showing large productivity gains was with people who wer…
  • @loosf Luis on bluesky
    www.anthropic.com/research/AI- ...  Anthropic's own fucking study lmao  —  “Yeah AI coding makes you worse at it”  —  like  —  significantly so
  • @smcgrath.phd Scott McGrath on bluesky
    Really important research out of Anthropic: In a RCT study, they found AI coding assistance resulted in a 𝟏𝟕% 𝐝𝐫𝐨𝐩 in mastery for users.  —  While tasks were slightly faster, offloading thinking to AI stunted skill growth. …
  • r/programmingcirclejerk r on reddit
    We found that using AI assistance led to a statistically significant decrease in mastery (...) Using AI sped up the task slightly, but this didn't reach the threshold of statistical significance.
  • @smcgrath.phd Scott McGrath on bluesky
    You do have to give Anthropic credit here.  It is rare for a lab to publish data questioning its own tools.  —  Meta constantly buries internal findings that challenge their business model.  This kind of transparency is uncommon and should be encouraged instead of dog-piled.  —  …
  • @jordannovet Jordan Novet on bluesky
    “As companies transition to more AI code writing with human supervision, humans may not possess the necessary skills to validate and debug AI-written code if their skill formation was inhibited by using AI in the first place.”  [embedded post]
  • @nickstenning Nick Stenning on bluesky
    This lines up with my experience: those who use language models as unstructured conceptual search engines and intellectual foils do well and learn.  Those who use them to do the work without judging their output produce rubbish and atrophy their own skills. www.anthropic.com/rese…
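The discussion above turns on two statistical claims: the AI group's roughly two-minute speed advantage did not reach significance, while the 17% quiz-score gap did. As a minimal sketch of the kind of two-sample comparison involved (not Anthropic's actual analysis, and using entirely hypothetical scores, since the study's raw data is not reproduced here), a Welch's t-test can be computed with only the standard library:

```python
import math
from statistics import mean, variance

def welch_t(a, b):
    """Welch's two-sample t statistic and approximate degrees of freedom
    (Welch-Satterthwaite), for groups with possibly unequal variances."""
    va, vb = variance(a), variance(b)   # sample variances
    na, nb = len(a), len(b)
    se2 = va / na + vb / nb             # squared standard error of the mean difference
    t = (mean(a) - mean(b)) / math.sqrt(se2)
    df = se2 ** 2 / ((va / na) ** 2 / (na - 1) + (vb / nb) ** 2 / (nb - 1))
    return t, df

# Hypothetical quiz scores (percent), for illustration only.
no_ai   = [82, 75, 88, 79, 84, 77, 81, 86]
with_ai = [65, 70, 58, 72, 61, 68, 63, 66]

t, df = welch_t(no_ai, with_ai)
print(f"t = {t:.2f}, df = {df:.1f}")
```

With a |t| well above ~2 at these degrees of freedom, a gap of this size is unlikely to be noise; a small |t| (as with the timing difference the study reports) is consistent with no real effect.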