In April of this year, Microsoft first announced the Phi-3 family of SLMs (Small Language Models), which offer great performance at a low cost and with low latency. The Phi-3-mini is a 3.8B language model available in two context-length variants, 4K and 128K tokens. The Phi-3-medium is a 14B language model, also available in the same two context-length variants.
Microsoft today announced that both Phi-3-mini and Phi-3-medium are available for fine-tuning on Azure. Fine-tuning allows developers to improve the base model's performance for different use cases. For example, one could fine-tune the Phi-3-medium model for student tutoring, or build a chat app with a particular tone or style of response. Leading organizations like Khan Academy are already using the Phi-3 model in real-world AI applications.
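As a rough sketch of what preparing such a fine-tuning job can look like: chat-model fine-tuning services commonly take training data as a JSONL file, one chat-formatted example per line. The helper name, file name, and tutoring examples below are illustrative assumptions, not details from the announcement.

```python
import json

def build_training_example(system_prompt, user_msg, assistant_msg):
    """Package one tutoring exchange in the common chat-messages format.
    (Hypothetical helper; the exact schema Azure expects may differ.)"""
    return {
        "messages": [
            {"role": "system", "content": system_prompt},
            {"role": "user", "content": user_msg},
            {"role": "assistant", "content": assistant_msg},
        ]
    }

examples = [
    build_training_example(
        "You are a patient math tutor who guides students step by step.",
        "How do I factor x^2 + 5x + 6?",
        "Look for two numbers that multiply to 6 and add to 5: 2 and 3. "
        "So x^2 + 5x + 6 = (x + 2)(x + 3).",
    ),
]

# Write one JSON object per line, the usual JSONL layout for training sets.
with open("tutoring_train.jsonl", "w", encoding="utf-8") as f:
    for ex in examples:
        f.write(json.dumps(ex) + "\n")
```

A real tutoring dataset would need many such examples covering the desired tone and subject matter; the resulting file is what gets uploaded when the fine-tuning job is created.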
Microsoft also announced the general availability of Models-as-a-Service (serverless endpoint) capability. As expected, the Phi-3-small model is now available via a serverless endpoint, allowing anyone to quickly develop AI applications without worrying about the underlying infrastructure. Phi-3-vision, the multi-modal model available through the Azure AI model catalog, will soon be available via a serverless endpoint as well.
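To illustrate the "no infrastructure" point, a minimal sketch of calling such a serverless endpoint follows. The endpoint URL and API key are placeholders, and the request body follows the widely used chat-completions convention, which may not match the actual Azure API field-for-field.

```python
import json
import urllib.request

ENDPOINT_URL = "https://<your-endpoint>/chat/completions"  # placeholder
API_KEY = "<your-api-key>"  # placeholder

def build_chat_request(prompt, max_tokens=256):
    """Assemble the JSON body and auth headers for one chat completion call."""
    body = {
        "messages": [{"role": "user", "content": prompt}],
        "max_tokens": max_tokens,
    }
    headers = {
        "Content-Type": "application/json",
        "Authorization": f"Bearer {API_KEY}",
    }
    return json.dumps(body).encode("utf-8"), headers

def send_chat(prompt):
    """POST the request and return the assistant's reply text.
    (Not invoked here, since it needs a real endpoint and key.)"""
    data, headers = build_chat_request(prompt)
    req = urllib.request.Request(ENDPOINT_URL, data=data, headers=headers)
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)["choices"][0]["message"]["content"]
```

The appeal of the serverless model is visible here: there is no GPU cluster or deployment to manage, just an HTTPS call billed per use.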
Last month, Microsoft updated the Phi-3-mini model to offer significant improvements. As per industry benchmarks, Phi-3-mini-4k now scores 35.8 (previously 21.7) with the June 2024 update, and Phi-3-mini-128k scores 37.6 (previously 25.7).
Microsoft also highlighted the new models that were recently made available on Azure:
- OpenAI GPT-4o mini
- Mistral Large 2
- Meta Llama 3.1 family of models
- Cohere Rerank
The new Azure AI Content Safety features, including prompt shields and protected material detection, are now enabled by default for Azure OpenAI Service. Developers can use these features as content filters for any foundation model, including Phi-3, Llama, Mistral, and more.
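In practice, such content filters annotate model responses with per-category safety results that the application can inspect before showing output to a user. The response shape below is an illustrative assumption, not the exact schema the service returns.

```python
def is_safe(choice):
    """Return True if no content-safety category flagged this completion.
    (Assumes a hypothetical per-choice 'content_filter_results' annotation.)"""
    results = choice.get("content_filter_results", {})
    return not any(cat.get("filtered", False) for cat in results.values())

# Illustrative response fragment with two safety categories, none triggered.
sample_choice = {
    "message": {"role": "assistant", "content": "Hello!"},
    "content_filter_results": {
        "hate": {"filtered": False, "severity": "safe"},
        "violence": {"filtered": False, "severity": "safe"},
    },
}

print(is_safe(sample_choice))  # True
```

An app would typically suppress or replace the completion whenever `is_safe` returns False, rather than surfacing filtered content.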
With these updates and expansions, Microsoft is clearly demonstrating its commitment to advancing AI capabilities on Azure. By continuously investing in making state-of-the-art AI models available on Azure and providing accessible tools for fine-tuning and deployment, Microsoft is enabling developers to easily create AI solutions.
Source: Microsoft