
Long Context, But Actually

What's this blog post about?

AI21 Labs Co-CEO Yoav Shoham discusses how the company built Jamba-Instruct, a foundation model with a 256K-token context window, to close the gap between claimed and effective context window length. The model is designed to serve long-context workflows efficiently and offers a longer context window than most competing models. Key questions addressed include whether having a long context window means the model actually does something useful with it, whether long-context models can be served with acceptable latency and unit economics, and whether long context still matters in an era dominated by retrieval-augmented generation (RAG).
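As a rough illustration only (not taken from the post), the sketch below shows how a long document might be passed to Jamba-Instruct in a single request through AI21's Python SDK. The client class, import paths, model name string, file name, and prompt are assumptions based on the SDK's OpenAI-style chat-completions interface, not details confirmed by the post.

    # Minimal sketch; assumes AI21's Python SDK exposes a chat-completions
    # style interface and accepts "jamba-instruct" as a model name.
    from ai21 import AI21Client
    from ai21.models.chat import ChatMessage

    client = AI21Client(api_key="YOUR_API_KEY")  # placeholder key

    # Read a long document; a 256K-token window fits on the order of
    # hundreds of pages of text in one request.
    with open("annual_report.txt") as f:  # hypothetical input file
        long_document = f.read()

    response = client.chat.completions.create(
        model="jamba-instruct",
        messages=[
            ChatMessage(
                role="user",
                content=f"Summarize the key risks in this report:\n\n{long_document}",
            )
        ],
    )
    print(response.choices[0].message.content)

The point of the sketch is simply that a 256K-token window lets an entire lengthy document be sent as-is, rather than being chunked and retrieved piecemeal.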

Company
AI21 Labs

Date published
June 26, 2024

Author(s)
-

Word count
3912

Hacker News points
None found.

Language
English


By Matt Makai. 2021-2024.