In February, Microsoft announced a new partnership with France-based startup Mistral AI. The multi-year agreement lets customers of Microsoft's Azure AI services access large language models created by Mistral AI on its Azure cloud servers.
The partnership began with Microsoft adding the Mistral Large LLM for Azure AI customers. This week, Microsoft revealed that those same customers can now access the Mistral Small LLM as well.
In a blog post, Microsoft outlined the features of Mistral Small and the kinds of AI tasks it can perform:
- A small model optimized for low latency: Very efficient for high-volume, low-latency workloads. Mistral Small is Mistral's smallest proprietary model; it outperforms Mixtral 8x7B and has lower latency.
- Specialized in RAG: Crucial information is not lost in the middle of long context windows. Supports up to 32K tokens.
- Strong in coding: Code generation, review and comments with support for all mainstream coding languages.
- Multi-lingual by design: Best-in-class performance in French, German, Spanish, and Italian - in addition to English. Dozens of other languages are supported.
- Efficient guardrails baked into the model, with an additional safety layer available via the safe-prompt option.
Microsoft says access to the new LLM is available with an Azure subscription, and customers can test Mistral Small in Azure AI Studio.
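For readers curious what calling such a hosted model typically involves, here is a minimal sketch of building an OpenAI-style chat-completions payload for a Mistral Small deployment. The endpoint URL, model name, and the `safe_prompt` flag are assumptions for illustration, not details taken from Microsoft's post; consult your own Azure AI Studio deployment for the real values.

```python
import json

# Placeholder values: replace with the endpoint and key shown for your
# own Azure AI Studio deployment (these names are hypothetical).
ENDPOINT = "https://<your-deployment>.<region>.models.ai.azure.com/v1/chat/completions"
API_KEY = "<your-azure-api-key>"

def build_chat_request(prompt: str, safe_prompt: bool = True) -> dict:
    """Build a chat-completions request body for a Mistral Small deployment.

    `safe_prompt` mirrors the optional safety layer mentioned in the
    announcement; whether the hosted API accepts this exact flag is an
    assumption in this sketch.
    """
    return {
        "model": "Mistral-small",
        "messages": [{"role": "user", "content": prompt}],
        "max_tokens": 256,
        "safe_prompt": safe_prompt,
    }

# Serialize the payload that would be POSTed to the endpoint.
payload = json.dumps(build_chat_request("Summarize RAG in one sentence."))
print(payload)
```

In practice you would send this body as JSON in an authenticated POST to the deployment endpoint; the sketch stops short of the network call so the request shape stays visible.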
Just a reminder: Mistral AI was formed in 2023 by a number of former employees of Meta and Google's DeepMind division. It raised $415 million in funding last year. Microsoft's partnership with the startup also reportedly included a small investment in the company, but the actual financial figures have yet to be disclosed.
Microsoft has been a major investor in and partner of a similar startup, OpenAI, for several years. Earlier this week, newly revealed internal Microsoft emails showed that in 2019, the company was afraid its main rival, Google, would beat it in AI development. That fear led Microsoft to make its first investment in OpenAI later that year.