Microsoft Introduces Phi-3-mini, The Smallest AI Language Model
Microsoft introduced a lightweight generative AI model, Phi-3-mini, on Tuesday, April 23. The cost-effective model is the newest addition to the company's line of small AI models, following Phi-1 and Phi-2.
"Phi-3 is not slightly cheaper, it's dramatically cheaper, we're talking about a 10x cost difference compared to other models out there with similar capabilities." - Sébastien Bubeck
The Phi-3 launch came soon after Microsoft released the Phi-2 model in December, which performed as well as larger models such as Llama 2.
With 3.8B parameters, Phi-3-mini performs on par with large language models (LLMs) such as GPT-3.5, despite its smaller, less complex design and the smaller dataset it was trained on.
Phi-3-mini is designed for devices with limited computing power, such as smartphones and laptops. Running locally increases accessibility, reduces the need for cloud-based operations, and improves user engagement while still supporting complex tasks on the device itself.
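As a back-of-the-envelope illustration of why a 3.8B-parameter model can run on such hardware, the sketch below estimates the memory its weights would need at common inference precisions; these are rough assumptions for illustration, not figures published by Microsoft.

```python
# Back-of-the-envelope memory estimate for a 3.8B-parameter model.
# Bytes per parameter depend on the numerical precision used at inference time.
PARAMS = 3.8e9

precisions = {
    "fp16 (2 bytes/param)": 2,
    "int8 (1 byte/param)": 1,
    "int4 (0.5 bytes/param)": 0.5,
}

for name, bytes_per_param in precisions.items():
    gigabytes = PARAMS * bytes_per_param / 1024**3
    print(f"{name}: ~{gigabytes:.1f} GB of weights")

# fp16: ~7.1 GB -> fits in the memory of many laptops
# int4: ~1.8 GB -> small enough for many recent smartphones
```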
Microsoft will also introduce two additional models to the Phi-3 family: Phi-3-small (with 7B parameters) and Phi-3-medium (with 14B parameters). Both will shortly be available in the Azure AI Model Catalog and other model gardens.
The Making of the Phi-3 Family
Eric Boyd, the Corporate VP of Microsoft Azure, explained how the Phi models differ: Phi-1 focused on coding, Phi-2 began to learn to reason, and Phi-3 improves on both, handling coding and reasoning better than its predecessors.
Boyd and his team developed a training method inspired by how children learn: the developers trained Phi-3 with a curriculum.
Their inspiration came from how children pick up knowledge from bedtime stories, which use simpler words and sentence structures to talk about larger topics.
Boyd added, "There aren't enough children's books out there, so we took a list of more than 3,000 words and asked an LLM to make children's books" to teach Phi. In short, Microsoft leveraged AI to teach AI.
Phi-3's Availability on Platforms
Phi-3-mini is now available on Microsoft's cloud service platform Azure, on Hugging Face, and on Ollama, a framework for running models on a local machine.
Additionally, it is available through NVIDIA Inference Microservices (NIM), where it has been optimized for NVIDIA's graphics processing units.
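For readers who want to try the model locally, here is a minimal sketch using the Hugging Face transformers library; the model identifier and generation settings below are assumptions based on Hugging Face's public catalog rather than anything specified in Microsoft's announcement.

```python
# Minimal sketch: loading and prompting Phi-3-mini via Hugging Face transformers.
# The model ID is assumed from the Hugging Face catalog; depending on your
# transformers version, you may also need trust_remote_code=True.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "microsoft/Phi-3-mini-4k-instruct"  # assumed identifier
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id)

prompt = "Explain what a small language model is in one sentence."
inputs = tokenizer(prompt, return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

On Ollama, the equivalent is reportedly a single command (for example, `ollama run phi3`, though the exact model tag may differ).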
What are SLMs?
Phi-3-mini is a generative small language model (SLM). SLMs are packed with significantly fewer parameters, ranging from millions to a few billion. In comparison, LLMs contain billions and even trillions of parameters.
Let's look at a few more differences between LLMs and SLMs:
- Efficient and cost-effective: SLMs are cheaper to run than LLMs and accessible to a wider variety of users and organizations. Their integration with smartphones will further support more advanced personal assistant features.
- Faster inference time: An SLM's compact design offers quicker response times, which is essential for real-time applications (see the timing sketch after this list).
- Environmental impact: Smaller AI models have a smaller carbon footprint than larger models.
- Ease of integration: SLMs are easier to integrate into existing applications on smartphones or in regions with limited access to computing resources.
- Specialization and customization: SLMs can easily be customized to suit specific needs for the most relevant outputs.
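To make the latency point concrete, here is a small sketch that measures tokens per second for a locally loaded model with the transformers library; the model identifier is the same assumed one used in the loading sketch above, and actual throughput will vary widely with hardware and precision.

```python
# Rough tokens-per-second check for a locally loaded SLM.
# Model ID is assumed (see the earlier loading sketch); results depend on hardware.
import time
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "microsoft/Phi-3-mini-4k-instruct"  # assumed identifier
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id)

inputs = tokenizer("Summarize the benefits of small language models.",
                   return_tensors="pt")

start = time.perf_counter()
outputs = model.generate(**inputs, max_new_tokens=64)
elapsed = time.perf_counter() - start

new_tokens = outputs.shape[-1] - inputs["input_ids"].shape[-1]
print(f"{new_tokens} tokens in {elapsed:.1f}s ({new_tokens / elapsed:.1f} tokens/s)")
```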
Introducing the compact Phi-3 family is not the only advancement the company is making in AI.
Microsoft partnered with French startup Mistral AI, allowing the startup to offer its models through the Azure cloud computing platform. As part of the deal, Microsoft will invest $16.3M in Mistral AI.
On April 26, Microsoft also beat Wall Street's third-quarter revenue and profit estimates by an impressive $1B, driven by its AI investments.
Microsoft's AI push and its latest efforts to fill gaps in the AI market will likely boost its overall revenue and raise technology standards across the industry.