Microsoft may have a major partnership with OpenAI to use its GPT language models in Bing Chat. However, it's also helping to develop another large language model from a much more established tech company: Meta.
Today, as part of Microsoft Inspire 2023, the company announced a new partnership with Meta to launch Llama 2, the second-generation version of Meta's LLM.
In its blog post, Microsoft stated:
Llama 2 is designed to enable developers and organizations to build generative AI-powered tools and experiences. Meta and Microsoft share a commitment to democratizing AI and its benefits and we are excited that Meta is taking an open approach with Llama 2. We offer developers choice in the types of models they build on, supporting open and frontier models and are thrilled to be Meta’s preferred partner as they release their new version of Llama 2 to commercial customers for the first time.
Meta CEO Mark Zuckerberg revealed more info on Llama 2 in a Facebook post, including how it differs from the company's previous large language model, Llama 1:
Llama 2 was pretrained on 40% more data than Llama 1 and has improvements to its architecture. For the fine-tuned models, we collected more than 1 million human annotations and applied supervised fine-tuning and reinforcement learning with human feedback (RLHF) with leading results on safety and quality.
Llama 2 is being released as open-source software and can be used for free for research and commercial purposes. It's available via Microsoft's Azure AI model catalog for cloud-based solutions, and it's also available to run locally on Windows. Microsoft stated:
Llama 2 is the latest addition to our growing Azure AI model catalog. The model catalog, currently in public preview, serves as a hub of foundation models and empowers developers and machine learning (ML) professionals to easily discover, evaluate, customize and deploy pre-built large AI models at scale.
In addition to Azure, Meta says Llama 2 will be available through other providers, including Amazon Web Services and Hugging Face.
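For developers curious what working with the model looks like in practice, here is a minimal sketch of loading a Llama 2 chat model through the Hugging Face transformers library. This example is not from the announcement itself: the model identifier "meta-llama/Llama-2-7b-chat-hf" and the requirement to request gated access on Hugging Face first are assumptions beyond what the article states.

```python
# Minimal sketch: generating text with a Llama 2 chat model via Hugging Face.
# Assumes the transformers library is installed and that access to the gated
# "meta-llama/Llama-2-7b-chat-hf" repository has already been granted.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "meta-llama/Llama-2-7b-chat-hf"  # assumed hosted model identifier

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id)

prompt = "Explain what a large language model is in one sentence."
inputs = tokenizer(prompt, return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

The same model weights can instead be deployed through the Azure AI model catalog mentioned above, which handles hosting and scaling rather than running the model on local hardware.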