
AI21 Labs' Jamba 1.5 family of efficient models now available on Microsoft Azure


AI21 Labs today announced the new Jamba 1.5 family of open models. The family includes two models, Jamba 1.5 Mini and Jamba 1.5 Large, both built on a hybrid SSM-Transformer architecture that the company says delivers strong efficiency along with long-context handling, speed, and quality.

The Jamba 1.5 models support an effective context window of 256K tokens, which AI21 Labs says is the longest on the market. The company also claims this is the first time a non-Transformer architecture has matched the performance and efficiency of current Transformer-based models, and that the Jamba 1.5 models are up to 2.5x faster on long contexts and the fastest across all context lengths in their size class. On the Arena Hard benchmark, Jamba 1.5 Mini scored 46.1 and Jamba 1.5 Large scored 65.4, results AI21 Labs cites to position both as the best open models currently available.

In addition to English, the Jamba 1.5 models support several languages, including Spanish, French, Portuguese, Italian, Dutch, German, Arabic, and Hebrew. They also offer developer-friendly features such as structured JSON output, function calling, digesting document objects, and citation generation. The models have a lower memory footprint than competing models of similar size; for example, Jamba 1.5 Mini can handle context lengths of up to 140K tokens on a single GPU.
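To illustrate how a developer might use the structured JSON output feature, here is a minimal sketch of a request to a Jamba 1.5 Mini deployment on an Azure AI serverless endpoint. The endpoint URL, environment variable, and exact request schema are assumptions modeled on the common OpenAI-style chat-completions format, not AI21's or Microsoft's documented API; consult the Azure AI model catalog documentation for the authoritative details.

# Minimal sketch: structured JSON output from a Jamba 1.5 Mini deployment.
# The endpoint URL, API key variable, and payload field names are assumptions
# based on the common chat-completions schema; check the Azure docs for the
# exact contract of your deployment.
import os
import requests

ENDPOINT = "https://<your-endpoint>.inference.ai.azure.com/chat/completions"  # placeholder
API_KEY = os.environ["AZURE_AI_API_KEY"]  # hypothetical environment variable name

payload = {
    "messages": [
        {"role": "system", "content": "Extract the invoice number and total as JSON."},
        {"role": "user", "content": "Invoice #A-1043, total due: $218.50"},
    ],
    # Structured JSON output: the response_format field follows the widely used
    # convention; the model's own docs may name this differently.
    "response_format": {"type": "json_object"},
    "max_tokens": 200,
    "temperature": 0.0,
}

resp = requests.post(
    ENDPOINT,
    headers={"Authorization": f"Bearer {API_KEY}", "Content-Type": "application/json"},
    json=payload,
    timeout=60,
)
resp.raise_for_status()
print(resp.json()["choices"][0]["message"]["content"])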

Pankaj Dugar, SVP and GM of North America at AI21, said:

"We are excited to deepen our collaboration with Microsoft, bringing the cutting-edge innovations of the Jamba Model family to Azure AI users. As an advanced hybrid SSM-Transformer model suite, the Jamba open model family democratizes access to LLMs that offer efficiency, low latency, high quality, and long-context handling. These models elevate enterprise performance and are seamlessly integrated with the Azure AI platform."

The Jamba 1.5 models are released under the Jamba Open Model License and are available to developers through Microsoft's Azure AI model catalog as well as other leading cloud providers. On Azure, pay-as-you-go inference is priced as follows (a quick cost estimate is sketched after the list):

  • Jamba 1.5 Large: $0.002 per 1K input tokens and $0.008 per 1K output tokens.
  • Jamba 1.5 Mini: $0.0002 per 1K input tokens and $0.0004 per 1K output tokens.
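As a back-of-the-envelope illustration of those rates, the short sketch below estimates the cost of a single long-context request. The token counts used in the example are made-up values for illustration, not measurements.

# Cost estimate at the Azure pay-as-you-go rates listed above.
# The token counts in the example call are invented for illustration.
PRICES_PER_1K = {
    "jamba-1.5-large": {"input": 0.002, "output": 0.008},
    "jamba-1.5-mini": {"input": 0.0002, "output": 0.0004},
}

def estimate_cost(model: str, input_tokens: int, output_tokens: int) -> float:
    """Return the estimated USD cost of one request at the listed rates."""
    rates = PRICES_PER_1K[model]
    return (input_tokens / 1000) * rates["input"] + (output_tokens / 1000) * rates["output"]

# Example: a 200K-token document summarized into a 1K-token answer.
print(f"Large: ${estimate_cost('jamba-1.5-large', 200_000, 1_000):.4f}")  # $0.4080
print(f"Mini:  ${estimate_cost('jamba-1.5-mini', 200_000, 1_000):.4f}")   # $0.0404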

By releasing these models under the Jamba Open Model License, AI21 Labs says it is taking a significant step towards democratizing access to high-quality, long-context AI models.

Source: AI21 Labs
