Jamba

A Groundbreaking SSM-Transformer Open Model

Debuting the first production-grade Mamba-based model, delivering best-in-class quality and performance.

The best of both worlds

Mamba, the novel Structured State Space model (SSM) architecture, was designed to address the limitations of the traditional Transformer architecture, but it has shortcomings of its own. Jamba offers the best of both worlds.

[Diagram: Transformer and Mamba architectures]

Jamba outperforms or matches other models in its size class

Jamba scores the highest on reasoning-related benchmarks.

    Unprecedented throughput gains

    Jamba delivers 3X throughput on long contexts, making it the most efficient model in its size class.

    Prioritizing cost-effectiveness & accessible deployment

    Jamba is the only model of its size that fits 140K context on a single GPU.

    Jamba introduces
    a novel LLM architecture

    Jamba is built on an SSM-Transformer mixture-of-experts (MoE)
    architecture. It interleaves Transformer and SSM layers in a hybrid
    design, enjoying the benefits of both architectures.
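The interleaving idea can be sketched in a few lines. This is an illustrative toy, not Jamba's actual configuration: the block count, the attention-to-Mamba ratio, and the MoE placement interval below (`attn_every`, `moe_every`) are assumed parameters for demonstration only.

```python
# Illustrative sketch of a hybrid SSM-Transformer stack with MoE layers.
# The specific ratios here are assumptions, not Jamba's published config.

def build_hybrid_stack(n_blocks=4, attn_every=2, moe_every=2):
    """Return (mixer, feed-forward) labels for each block:
    mostly Mamba (SSM) mixers with periodic attention layers,
    and MoE feed-forward layers replacing some dense MLPs."""
    stack = []
    for i in range(n_blocks):
        # Sequence mixer: attention every `attn_every` blocks, Mamba otherwise
        mixer = "attention" if i % attn_every == attn_every - 1 else "mamba"
        # Feed-forward: MoE every `moe_every` blocks, dense MLP otherwise
        ff = "moe" if i % moe_every == moe_every - 1 else "mlp"
        stack.append((mixer, ff))
    return stack

print(build_hybrid_stack())
# → [('mamba', 'mlp'), ('attention', 'moe'), ('mamba', 'mlp'), ('attention', 'moe')]
```

The payoff of this layout is that the SSM layers carry most of the sequence mixing at linear cost in context length, while the periodic attention layers retain the Transformer's in-context recall; the MoE layers grow parameter count without growing per-token compute.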

    An Innovative Open Model
    to Power Experimentation

    A Base Model Designed
for Builders

    Jamba is a base model intended for use as a foundation layer for
    fine-tuning, training, and developing custom solutions.

    start building

    Experiment
    Explore
    Build
    With Jamba.


    AI21 builds reliable, practical, and scalable AI solutions for the enterprise that solve key business challenges. To learn more about our other offerings:

    Talk to Us