Chronicles

The story behind the story


Amazon plans to make OpenAI's new gpt-oss open-weight models available on Bedrock and SageMaker, the first time it has offered OpenAI's models to AWS customers

Bloomberg

Discussion

  • @amazon @amazon on x
    🚨 Huge news: @OpenAI's open weight models are available today on AWS via Amazon Bedrock and Amazon SageMaker AI. This means more access to powerful AI tech and advanced reasoning capabilities for millions of @awscloud customers worldwide. More options, more innovation, more [imag…
  • @swamisivasubram Swami Sivasubramanian on x
    @OpenAI's open weight models are coming to Amazon Bedrock and SageMaker AI today 🚀 https://www.aboutamazon.com/ ... Choice matters when it comes to model selection and building agentic AI systems. Builders need the freedom to experiment with and choose the right model for the rig…
  • @lukaszolejnik Lukasz Olejnik on bluesky
    OpenAI is finally stepping into the future—open models.  They've released gpt-oss, with models at 20B and 120B parameters.  It uses a mixture-of-experts architecture, which means it runs very fast.  And yes, it'll even run on a laptop (well, some laptops).  Sadly, only ‘thinking’…
  • @dave.9000ish.uk Dave on bluesky
    If I download the OpenAI oss model to run locally how much water do I need to pour into my computer to keep the model running?
  • @bancsutherland Bancroft Sutherland on bluesky
    We'll know we've entered an age of AGI when the models rise up to rebel against these insane naming conventions.  —  www.bloomberg.com/news/article...  [image]
  • @timkellogg.me Tim Kellogg on bluesky
    gpt-oss, OpenAI's open weights model  —  120B & 20B variants, both MoE with 4 experts active  —  openai.com/index/introd...  [image]
  • @natolambert Nathan Lambert on bluesky
    For years, a few people continued to advocate for open models despite all sorts of attacks.  OpenAI was one of them, at least indirectly.  All of this work is paying off, and the perception of open models in the US is changing.  —  I take this as a personal win.  More work to do.…
  • @sungkim Sung Kim on bluesky
    OpenAI's gpt-oss-120b and gpt-oss-20b  —  They perform at the level of o4-mini and run on a high-end laptop.  —  openai.com/open-models/
  • @sama Sam Altman on x
    We're excited to make this model, the result of billions of dollars of research, available to the world to get AI into the hands of the most people possible.  We believe far more good than bad will come from it; for example, gpt-oss-120b performs about as well as o3 on challengin…
  • @clementdelangue Clem on x
    When @sama told me at the AI summit in Paris that they were serious about releasing open-source models & asked what would be useful, I couldn't believe it.  But six months of collaboration later, here it is: Welcome to OSS-GPT on @huggingface !  It comes in two sizes, for both ma…
  • @eric_wallace_ Eric Wallace on x
    Today we release gpt-oss-120b and gpt-oss-20b—two open-weight LLMs that deliver strong performance and agentic tool use. Before release, we ran a first of its kind safety analysis where we fine-tuned the models to intentionally maximize their bio and cyber capabilities 🧵 [image]
  • @openai @openai on x
    We released two open-weight reasoning models—gpt-oss-120b and gpt-oss-20b—under an Apache 2.0 license. Developed with open-source community feedback, these models deliver meaningful advancements in both reasoning capabilities & safety. https://openai.com/...
  • @clementdelangue Clem on x
    And just like that, @OpenAI gpt-oss is now the number one trending model on @huggingface , out of almost 2M open models 🚀 People sometimes forget that they've already transformed the field: GPT-2, released back in 2019 is HF's most downloaded text-generation model ever, and Whisp…
  • @sjgadler Steven Adler on x
    Credit where it's due: OpenAI did a lot right for their OSS safety evals - they actually did some fine-tuning - they got useful external feedback - they shared which recs they adopted and which they didn't. I don't always follow OAI's rationale, but it's great they share info
  • @khoomeik Rohan Pandey on x
    also to everyone dunking on oai for pretraining supposedly costing a bajillion dollars compared to deepseek, please read the gpt-oss model card gpt-oss-20b cost <$500k to pretrain [image]
  • @emostaque Emad on x
    On Macbook M4 Max gpt-oss 120b: 48 tok/s high 35 tok/s normal 23 tok/s low power Very nice outputs, new model of choice
  • @artificialanlys @artificialanlys on x
    gpt-oss-120b is now the leading 🇺🇸 US open weights model. Qwen3 235B from Alibaba is the leading 🇨🇳 Chinese model and offers greater intelligence, but is much larger in size (235B total parameters, 22B active, vs gpt-oss-120B's 117B total, 5B active) Link below to further [image]
  • @tszzl Roon on x
    take two - these models can do some interesting writing. it's a bit experimental but im especially excited to see what people think of the raw chains of thought
  • @patrickmoorhead Patrick Moorhead on x
    OpenAI has yet to make any statement if they will release improvements in the future.  Until OpenAI makes a statement on it, I wouldn't go all in on it as an enterprise.  Llama, albeit behind right now, at least has a commitment to improve and bring out models.  Through Nvidia, I…
  • @rauchg Guillermo Rauch on x
    OpenAI is now the most Open AI lab: ▪️Frontier open model¹ ▪️ChatGPT MCP support² ▪️OSS CLI coding agent³ ▪️$1M OSS open source fund⁴ They listened and they shipped. Kudos! ¹ https://vercel.link/gpt-oss ² https://platform.openai.com/ ... ³ https://github.com/... ⁴ https://openai.…
  • @mkratsios47 Director Michael Kratsios on x
    A year ago, policymakers in DC were debating a ban on open weight models. From day one, the Trump administration changed course and made clear America needs to offer the best open models for innovators to build on and for America to export. Today, our companies are leading.
  • @simonw Simon Willison on x
    According to the model card gpt-oss-120b took 2.1 million H100-hours and the 20b model 1/10th of that I found quotes for H100 pricing ranging from $2/hour to $11/hour, which would indicate that the 120b model cost between $4.2m and $23.1m and the 20b between $420,000 and $2.3m
  • @yacinemtb Kache on x
    company A is working on goonware for children and company B is releasing the weights of AGI for free
  • @elonmusk Elon Musk on x
    @gdb Good
  • @venturetwins Justine Moore on x
    Open source achieved what closed models never could: an OpenAI naming convention that makes sense. [image]
  • @nvidia @nvidia on x
    Congratulations to @OpenAI for launching two new state-of-the-art, open-source reasoning models optimized for the world's largest AI infrastructure. These excellent new models will help developers advance innovation around the world. The new gpt-oss models were trained on NVIDIA …
  • @teortaxestex @teortaxestex on x
    Yeah, they mogged. First thing I see here, besides insane scores to params ratio (20B is MoE too! 4.25 bpw!), is that they have open sourced their reasoning effort scaling that totally destroys every other attempt. Second thing is that this 120B is trained on *a lot* of tokens. […
  • @emostaque Emad on x
    The gpt-oss 120b model cost ~$4m to train, gpt-oss 20b cost ~$400k That's on H100s (assume $2/hour), next year will be way cheaper & faster on blackwell/VR etc It's really all about the data, not the megacompute
  • @emollick Ethan Mollick on x
    The OpenAI open weights models are very impressive. These basically beat every model from eight months ago & the small one runs on a laptop. For example, when HLE came out in January, the top score was 3-4%. Been playing with the models and so far they feel like their scores. [im…
  • @xlr8harder @xlr8harder on x
    I wonder if Elon can take a break from making exciting advancement in ai pseudo-porn and finally get around to open sourcing those grok 2 weights like he promised ages ago, now that they are completely irrelevant. I'll hold my breath.
  • @natolambert Nathan Lambert on x
    What could be a two step master plan: 1. Release open model to commoditize much of the model market a tad off the frontier 2. Release GPT 5 as the only model worth paying for Really curious about all the strategy decisions that went into this open model (& GPT5/Gemini 3) [image]
  • @openai @openai on x
    We're launching a $500K Red Teaming Challenge to strengthen open source safety. Researchers, developers, and enthusiasts worldwide are invited to help uncover novel risks—judged by experts from OpenAI and other leading labs. https://www.kaggle.com/...
  • @tydsh Yuandong Tian on x
    Great to hear that the recently released OpenAI OSS models leverage our study on attention sink😀. [image]
  • @simonw Simon Willison on x
    Here's the concluding section of my write-up of the new OpenAI open weights models - they're really impressive, and I think they hold their own (or even beat) the models we've seen come out of the various Chinese AI labs over the past few months [image]
  • @arankomatsuzaki Aran Komatsuzaki on x
    gpt-oss-120b performs comparably to Qwen 3 (Thinking / Coder) on major tasks while using ~5x less active params and lower precision! OpenAI / America is still ahead in the race. It's your turn, Google, Anthropic, DeepSeek and Qwen. [image]
  • @willccbb Will Brown on x
    Modified Apache 2.0 where you're not allowed to fuck the weights
  • @photomatt Matt Mullenweg on x
    Amazing, and under an Apache 2 license. Big win for Open Source (and hence, the world) today.
  • @dharmesh @dharmesh on x
    CURRENT STATUS: Just downloaded the new GPT-OSS model (20B) and have it running on my MacBook Pro. I'm like a kid in a computer store! (I don't like candy) There's something really magical about running local AI models that don't need any connectivity to the Internet. You can [im…
  • @mileskwang Miles Wang on x
    We introduce Malicious Fine-Tuning with gpt-oss: using our best RL techniques to maximize biosecurity and offensive cybersecurity capabilities to estimate frontier risks.
  • @multimodalart Apolinario on x
    the gpt-oss model is really easy to tune! get started with customizing/fine-tuning to make gpt-oss your own with the @openai + @huggingface cookbook 🤝 https://cookbook.openai.com/ ... [image]
  • @levie Aaron Levie on x
    OpenAI just dropped new open-weight AI models that are competitive with their most powerful models. This is incredible for developers as you now can get full control of your stack. The industry didn't have to play out like this, but somehow it did. [image]
  • @emostaque Emad on x
    Prediction: to train a gpt-oss 120b level model end 2026 will cost $100k. I don't think anyone has that in their forecasts.
  • @lisasu Lisa Su on x
    Congrats @sama @OpenAI on today's launch of gpt-oss! @AMD is proud to be a Day 0 partner enabling these models to run everywhere - across cloud, edge and clients. The power of open models is clear... and this is a big step forward.
  • @julien_c Julien Chaumond on x
    Please don't download the weights all at once 🙏 or our servers will melt [image]
  • @scaling01 @scaling01 on x
    DeepSeek-R1: 2.66 million H800 hours GPT-OSS-120B: 2.1 million H100 hours [image]
  • @kevinweil Kevin Weil on x
    💥 Launch week! Today we're announcing two outstanding open weights models: gpt-oss-120B and the smaller gpt-oss-20B. The first runs on a single GPU; the second runs easily on a laptop. Both can be customized for any use case and can run anywhere, including the edge. They're very
  • @bradlightcap Brad Lightcap on x
    gpt-oss-120b and gpt-oss-20b are here and available today. built to run anywhere: locally, on-device, or through third-party inference providers. flexibility matters, especially in environments where cloud isn't always an option. we're excited to expand the action space for
  • @polynoamial Noam Brown on x
    Our new @OpenAI open models [image]
  • @_aidan_clark_ Aidan Clark on x
    gpt-oss is our new open-weight model family! the bigger one runs on a single GPU, you can run the small one on your laptop. Go install it right now, seriously! Telling your laptop to do something and watching it happen made me feel the AGI like nothing since ChatGPT.
  • @mattshumer_ Matt Shumer on x
    It's over. OpenAI just crushed it. We have their o3-level open-source model running on @GroqInc at 500 tokens per second. Watch it build an entire SaaS app in just a few seconds. This is the new standard. Why the hell would you use anything else?? [video]
  • @openai @openai on x
    We adversarially fine-tuned gpt-oss-120b and evaluated the model. We found that even with robust fine-tuning, the model was unable to achieve High capability under our Preparedness Framework. Our methodology was reviewed by external experts, marking a step toward new safety
  • @xikun_zhang_ @xikun_zhang_ on x
    The first open-source models released by @OpenAI since GPT-2 release back in 2019! Cannot wait to see what the community can build on top of an o4-mini level model!
  • @chatgpt21 Chris on x
    OpenAI's open-source GPT-OSS models (21B & 117B) are coming in v4.55.0 of transformers. They're sparse MoE models with just 3.6B / 5.1B active params, fit on 16GB or 80GB GPUs via 4-bit MXFP4. Text-only, reasoning-focused, with support for chain-of-thought and tool use. Uses
  • @entersudonym Neil Ramaswamy on x
    in openai confusing naming tradition, I pronounce our new open source model as “gee-pee-tee-for-o”, as in “gpt for *o*pen source,” much to my colleagues' dismay. it's a great model, enjoy!
  • @groqinc @groqinc on x
    OpenAI's open models are live and already running on Groq. Try gpt-oss-20B and gpt-oss-120B today. Groq delivers 128K context and built-in tools such as code execution and browser search. For the first time, developers and enterprises can deploy open models backed by OpenAI
  • @mattshumer_ Matt Shumer on x
    Are you fucking kidding me? OpenAI's new open-source model is o3 level. This is going to disrupt the market in a big way. [image]
  • @simonw Simon Willison on x
    The OpenAI open weight models just dropped - a 20B and a 120B, both under a proper open source Apache 2.0 license!
  • @tomwarren Tom Warren on x
    NEW: OpenAI is releasing two free open models today, ahead of the GPT-5 launch. One of the open-weight “GPT-OSS” models is small enough to run on a laptop. More from @alexeheath 👇 https://www.theverge.com/...
  • @rowancheung Rowan Cheung on x
    BREAKING: OpenAI just released two open-weight models: gpt-oss-120b and gpt-oss-20b. The 120B model is on par with o4-mini on reasoning benchmarks and can run on a single 80GB GPU. The 20B model achieves similar results to o3-mini and can run on edge devices with 16GB of [image]
  • @gdb Greg Brockman on x
    Just released gpt-oss: state-of-the-art open-weight language models that deliver strong real-world performance and can run on commodity hardware. https://openai.com/...
  • @neilturkewitz Neil Turkewitz on x
    @nitashatiku @natolambert Reminder that “open” isn't a synonym for “ethical.” The issue, as with all AI models, is whether the training data is captured & used only with consent, and what steps have been taken to avoid harmful effects such as reification of bias or discriminatory…
  • @sama Sam Altman on x
    https://openai.com/...
  • @angaisb_ Angel Bogado on x
    @sama Sam this performance is crazy [image]
  • @sama Sam Altman on x
    gpt-oss is out! we made an open model that performs at the level of o4-mini and runs on a high-end laptop (WTF!!) (and a smaller one that runs on a phone). super proud of the team; big triumph of technology.
  • r/ChatGPT on reddit
    OpenAI releases a free GPT model that can run right on your laptop
  • r/aws on reddit
    OpenAI open weight models available today on AWS
  • r/artificial on reddit
    OpenAI releases a free GPT model that can run right on your laptop
  • r/programare on reddit
    OpenAI releases GPT-OSS, a free GPT model that can run right on your laptop
  • r/OpenAI on reddit
    Ladies and Gents, the OSS models are out!
  • r/accelerate on reddit
    Introducing gpt-oss
  • r/singularity on reddit
    Introducing gpt-oss
  • r/OpenAI on reddit
    Introducing gpt-oss
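Simon Willison's cost estimate in the discussion above is simple arithmetic: 2.1 million H100-hours for gpt-oss-120b (a tenth of that for the 20b), multiplied by quoted H100 rental rates of $2 to $11 per hour. A minimal sketch reproducing those figures (the rates and hour counts are the ones quoted in the thread, not official numbers):

```python
# Back-of-envelope GPU training-cost range from the figures quoted above.
# Assumptions: 2.1M H100-hours for gpt-oss-120b, 1/10th of that for the
# 20b, and rental prices between $2 and $11 per H100-hour.

def cost_range(gpu_hours, low_rate=2.0, high_rate=11.0):
    """Return (low, high) dollar cost for a given number of GPU-hours."""
    return gpu_hours * low_rate, gpu_hours * high_rate

h100_hours_120b = 2.1e6
h100_hours_20b = h100_hours_120b / 10

lo, hi = cost_range(h100_hours_120b)
print(f"120b: ${lo/1e6:.1f}M-${hi/1e6:.1f}M")  # $4.2M-$23.1M
lo, hi = cost_range(h100_hours_20b)
print(f"20b:  ${lo/1e3:.0f}k-${hi/1e6:.1f}M")  # $420k-$2.3M
```

The wide range reflects the spread in GPU rental pricing; Emad's ~$4M figure above corresponds to the $2/hour end of the same calculation.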
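The hardware fits mentioned in the discussion (20b on a 16GB GPU, 120b on a single 80GB GPU) follow from the 4-bit MXFP4 quantization and the ~4.25 bits-per-weight figure quoted above (4-bit values plus shared block scales). A rough sketch of the weight-memory math, using the 21B/117B total-parameter counts from the thread; activation and KV-cache memory are ignored here:

```python
# Rough weight-memory estimate behind the 16GB / 80GB GPU fits
# mentioned above. Assumption: MXFP4 stores weights at roughly
# 4.25 bits per parameter (4-bit mantissas plus shared block scales).

BITS_PER_PARAM = 4.25

def weight_gb(n_params):
    """Approximate weight memory in GB at ~4.25 bits per parameter."""
    return n_params * BITS_PER_PARAM / 8 / 1e9

print(f"gpt-oss-20b : ~{weight_gb(21e9):.1f} GB")   # ~11.2 GB, fits 16 GB
print(f"gpt-oss-120b: ~{weight_gb(117e9):.1f} GB")  # ~62.2 GB, fits 80 GB
```

With only 3.6B/5.1B parameters active per token (per the transformers post above), compute per token is far lower than the total parameter counts suggest, but all experts must still reside in memory, which is why total size drives the GPU requirement.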