Microsoft Readies New AI Model To Compete With Google, OpenAI
For the first time since it invested more than $10 billion in OpenAI in exchange for the rights to reuse the startup's AI models, Microsoft is training a new, in-house AI model large enough to compete with state-of-the-art models from Google, Anthropic and OpenAI itself. The Information:

The new model, internally referred to as MAI-1, is being overseen by Mustafa Suleyman, the ex-Google AI leader who most recently served as CEO of the AI startup Inflection before Microsoft hired most of the startup's staff and paid $650 million for the rights to its intellectual property in March. This is a Microsoft model, however, not one carried over from Inflection, although it may build on training data and other technology from the startup. It is separate from the Pi models that Inflection previously released, according to two Microsoft employees with knowledge of the effort.

MAI-1 will be far larger than any of the smaller, open-source models Microsoft has previously trained, meaning it will require more computing power and training data and will therefore be more expensive, according to the people. MAI-1 will have roughly 500 billion parameters, or settings that can be adjusted to determine what a model learns during training. By comparison, OpenAI's GPT-4 reportedly has more than 1 trillion parameters, while smaller open-source models released by firms such as Meta Platforms and Mistral have around 70 billion parameters.

That means Microsoft is now pursuing a dual trajectory in AI: developing "small language models" that are inexpensive to build into apps and could run on mobile devices, alongside larger, state-of-the-art AI models.