
The Jamba 1.5 Open Model Family: The Most Powerful and Efficient Long Context Models

What's this blog post about?

AI21 has introduced the Jamba 1.5 family of open models, Jamba 1.5 Mini and Jamba 1.5 Large, built on a novel SSM-Transformer architecture. Within their size classes, the models lead on long-context handling, speed, and quality, and their 256K-token context window is the longest of any open model. Both are released under the Jamba Open Model License, which encourages accessibility and further experimentation. The efficiency-optimized architecture delivers top quality and speed without high serving costs, aided by ExpertsInt8, a novel quantization technique tailored to MoE models that keeps inference fast while preserving quality. The models target enterprise applications such as customer support agent assistants and chatbots, and offer significant cost, quality, and speed gains under high utilization when deployed in a customer's own environment.
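The post names ExpertsInt8 but does not explain its mechanics. As a rough, hypothetical illustration of the general idea behind INT8 weight quantization for MoE models (not AI21's actual implementation), the Python sketch below quantizes each expert's weight matrix to INT8 with per-row scales, since expert weights hold most of a MoE model's parameters. The function names and toy layer shapes are assumptions made for this example.

    import numpy as np

    def quantize_expert_int8(w):
        """Per-output-channel symmetric INT8 quantization of one expert's
        weight matrix. Returns int8 weights plus float scales for dequant."""
        # One scale per output row so large channels don't clip small ones.
        scales = np.abs(w).max(axis=1, keepdims=True) / 127.0
        scales = np.where(scales == 0, 1.0, scales)  # avoid divide-by-zero
        q = np.clip(np.round(w / scales), -127, 127).astype(np.int8)
        return q, scales.astype(np.float32)

    def dequantize(q, scales):
        """Recover an approximate float weight matrix at inference time."""
        return q.astype(np.float32) * scales

    if __name__ == "__main__":
        rng = np.random.default_rng(0)
        # Toy MoE layer: 8 experts, each a 64x256 FFN projection (hypothetical sizes).
        experts = [rng.normal(scale=0.02, size=(64, 256)) for _ in range(8)]
        quantized = [quantize_expert_int8(w) for w in experts]
        # Expert weights now occupy ~1/4 the memory of fp32 (~1/2 of bf16),
        # while dequantization error stays small relative to the weights.
        err = max(np.abs(dequantize(q, s) - w).max()
                  for (q, s), w in zip(quantized, experts))
        print(f"max abs dequant error: {err:.6f}")

A real serving stack would dequantize inside the fused MoE kernel rather than materializing float matrices; the sketch only shows the quantization arithmetic and the resulting memory savings.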

Company
AI21 Labs

Date published
Aug. 22, 2024

Author(s)
-

Word count
908

Hacker News points
None found.

Language
English

