
Chronicles

The story behind the story


Sources: Microsoft is training MAI-1, a new, in-house AI model with ~500B parameters, large enough to compete with top models from Google, Anthropic, and OpenAI

The Information · Aaron Holmes

Discussion

  • @marcslove Marc Love on threads
    If Microsoft Sherlocks OpenAI, I'm going to laugh about it for a good month.
  • @_xjdr @_xjdr on x
    Assuming this is correct (no idea) it is interesting that both Meta and MS have decided on ~400B - 500B param (assuming dense) models. Also interesting that is also roughly the same amount of activated params as the rumored GPT4 architecture. doesn't mean much by itself, but it…
  • @garymarcus Gary Marcus on x
    Today's big scoop from @theinformation is that Microsoft is training a giant model to compete with OpenAI. This fits into a larger pattern, going back over a year. Nadella clearly doesn't want to be dependent on OpenAI. And because OpenAI doesn't appear to have a lot of unique...
  • @danielnewmanuv Daniel Newman on x
    Microsoft is now working on its own Model (LLM) MAI-500B.  My Thoughts 👇🏻 No surprise that Microsoft, despite its deep connection to OpenAI, is going to develop and build its own LLMs.  What I don't see is this as an “attack” of any sort on OpenAI, but rather a diversification of…
  • @far__el Far El on x
    the (near) future is not 500B param models. that is just what bloated companies with no imagination can and will continue to do.
  • @bindureddy Bindu Reddy on x
    As I predicted, Microsoft is training its own LLM It's called MAI-1, a 500B param model, and may be previewed at their Build conference. When this model becomes available, it will only be natural for MSFT to push this instead of the GPT line. As predicted, OAI and MSFT are… [image]
  • @zerohedge @zerohedge on x
    MICROSOFT READIES NEW AI MODEL TO COMPETE WITH GOOGLE, OPENAI - THE INFORMATION Does this one actually make money?
  • @xlr8harder @xlr8harder on x
    This is an interesting development. I suppose MS would not want to depend entirely on OpenAI. It's starting to look quite crowded, though.
  • @dhinchcliffe Dion Hinchcliffe on x
    Much industry talk today about @Microsoft training its own #LLM. Known as #MAI-1, it has 500 billion parameters. The big Q is how @OpenAI will be positioned in the mix. My take: Microsoft has too few variety of models + this will diversify. But will alter OpenAI partnership.
  • @aaronpholmes Aaron Holmes on x
    Scoop: Microsoft is training its own large language model, internally labeled MAI-1, with Mustafa Suleyman leading the effort. The model is around 500B parameters and could compete directly with LLMs from Google, OpenAI, etc. Details: https://www.theinformation.com/ ...
  • @how_many_roads_ Dave Hayes on x
    @bindureddy While a smart model is important... the ability to build an agent framework around it is the only way to provide value (especially at the enterprise scale) MSFT has the absolute worst framework to do this... and their bloated ecosystem will never be as streamlined and…
  • @andrewcurran_ Andrew Curran on x
    There have been indications for some time that Microsoft was intending to take their own path, and were planning to train a bespoke model to compete with GPT-4. We now have a name. ‘MAI-1’ As well as a size: 500b.
  • @dorialexander Alexander Doria on x
    Meanwhile the best supervised model in the world is 304M parameters. I think we have lost the plot somewhere.
  • @alice_comfy @alice_comfy on x
    @bindureddy Would assume a part of the motivation is Microsoft wanting to make copilot a local LLM, and OpenAI being against that.
  • @steph_palazzolo Stephanie Palazzolo on x
    AI Agenda scoop from @aaronpholmes: Microsoft is training its own LLM, led by Mustafa Suleyman. The model is ~500Bn parameters, showing how MSFT is dual tracking its AI efforts, building both small, cheap models and larger ones that compete with OpenAI. https://www.theinformation…
  • @benbajarin Ben Bajarin on x
    I mentioned this not long after the Open AI relationship was clear last year but every company wanting to be an AI platform can't afford to outsource their primary model. This is the easiest prediction of vertical integration there is at the moment.
  • @aaronpholmes Aaron Holmes on x
    In a LinkedIn post just now, Microsoft CTO Kevin Scott confirms my report this morning that MAI is Microsoft's new AI model in the works (while tamping down hype): [image]
  • r/artificial on reddit
    Microsoft readies new AI model to compete with Google, OpenAI
  • r/singularity on reddit
    Microsoft is working on a 500B model called MAI-1