Meta has unveiled Code Llama, a large language model designed specifically for coding tasks. Code Llama is built on top of the previously released Llama 2 model and has been further trained on over 500 billion tokens of code and code-related data.
Code Llama aims to assist software developers by generating code and natural language responses to prompts. It supports common programming languages such as Python, JavaScript, Java, and C++, and can help with tasks like code completion and debugging.
Meta is releasing Code Llama in three sizes: 7 billion, 13 billion, and 34 billion parameters. The smaller 7B and 13B models are optimized for low-latency use cases such as real-time code completion. The 34B model delivers the best overall results but requires more computing power.
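For illustration, the sketch below shows the real-time completion use case with the smallest model: it feeds the start of a Python function to the model and asks it to continue. The Hugging Face distribution channel and the codellama/CodeLlama-7b-hf model ID are assumptions for the sake of the example, not details from Meta's announcement.

```python
# A minimal sketch of prompt-driven code completion with the 7B model.
# The model ID below assumes the weights are published on Hugging Face;
# that packaging detail is an assumption, not part of Meta's announcement.
from transformers import AutoModelForCausalLM, AutoTokenizer

MODEL_ID = "codellama/CodeLlama-7b-hf"  # assumed model ID

tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
model = AutoModelForCausalLM.from_pretrained(MODEL_ID)

# Give the model the start of a function and let it complete the body.
prompt = 'def fibonacci(n):\n    """Return the n-th Fibonacci number."""\n'
inputs = tokenizer(prompt, return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=64, do_sample=False)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```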
Additionally, Meta has fine-tuned Python and Instruct variants of Code Llama. The Python variant is further specialized for Python code generation, while the Instruct variant is fine-tuned to follow natural language instructions and produce safer, more helpful responses.
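As a rough sketch of how the Instruct variant might be queried, the snippet below wraps a natural language request in Llama 2's [INST] chat format. Both the prompt format (borrowed from Llama 2's chat convention) and the codellama/CodeLlama-7b-Instruct-hf model ID are assumptions rather than details confirmed in the announcement.

```python
# Sketch: sending a natural-language request to the Instruct variant.
# The model ID and the [INST] prompt format are assumptions
# (borrowed from Llama 2's chat convention), not confirmed details.
from transformers import pipeline

generator = pipeline(
    "text-generation",
    model="codellama/CodeLlama-7b-Instruct-hf",  # assumed model ID
)

prompt = "[INST] Write a Python function that reverses a linked list. [/INST]"
result = generator(prompt, max_new_tokens=128, do_sample=False)
print(result[0]["generated_text"])
```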
Meta says Code Llama has the potential to boost productivity for professional developers as well as lower the barrier to entry for new coders. However, the company acknowledges the risks associated with large language models and believes an open-source approach is best for promoting safety.
Programmers are already using LLMs to assist with a variety of tasks. The goal is to make developer workflows more efficient so that developers can focus on the most human-centric aspects of their jobs.
Code Llama is designed to support software engineers in all sectors — including research, industry, open source projects, NGOs and businesses. But there are still many more use cases to support.
Code Llama is available for both research and commercial use under the same community license as Llama 2. The company hopes its release will lead to more innovation in AI coding assistants while allowing the community to evaluate the models' capabilities and vulnerabilities.
In July, Microsoft-owned GitHub launched Copilot Chat. Developers can ask Copilot questions about their code, get explanations of specific sections, and even have Copilot suggest bug fixes.