GitHub Copilot has used a range of large language models (LLMs) over its lifetime, chosen to suit different tasks. At launch, it was powered by Codex, a descendant of OpenAI's GPT-3 fine-tuned specifically for coding. In 2023, with the launch of GitHub Copilot Chat, it moved to OpenAI's GPT-3.5 and, subsequently, GPT-4. As OpenAI released newer models, GitHub Copilot's base model was updated as well, transitioning from GPT-3.5 Turbo to GPT-4o and GPT-4o mini depending on latency and quality requirements.
Today, at GitHub Universe, the GitHub team announced the integration of Anthropic's Claude 3.5 Sonnet, Google's Gemini 1.5 Pro, and OpenAI's o1-preview and o1-mini into GitHub Copilot. This is the first time GitHub Copilot has offered developers a choice of which model to use.
These new models from Google, Anthropic, and OpenAI will initially roll out in Copilot Chat. OpenAI's o1-preview and o1-mini are already available, Claude 3.5 Sonnet will roll out progressively over the next week, and Google's Gemini 1.5 Pro will arrive in the coming weeks. The models will also be integrated into Copilot Workspace and other relevant features across GitHub.
Thomas Dohmke, CEO of GitHub, stated the following regarding multi-model support in GitHub Copilot:
"In the past year, we have witnessed a surge in high-quality small and large language models that individually excel at different programming tasks. It is clear that the next phase of AI code generation will be defined not only by multi-model functionality but also by multi-model choice. GitHub is committed to its ethos as an open developer platform and ensuring every developer has the agency to build with the models that best suit their needs. Today at GitHub Universe, we delivered just that."
This multi-model GitHub Copilot experience is now accessible in GitHub Copilot Chat on github.com, in Visual Studio Code, and through Copilot extensions for Visual Studio. Organizations and enterprises will have full control over which models they enable for their developers.