
Jamba 1.5: A Powerful Hybrid Language Model for Long-Context Processing

Jamba 1.5, a cutting-edge large language model family from AI21 Labs, is built for handling extensive text contexts. Available in two versions – Jamba 1.5 Large (94 billion active parameters) and Jamba 1.5 Mini (12 billion active parameters) – it leverages a hybrid architecture combining the Mamba Structured State Space Model (SSM) with the traditional Transformer architecture. This approach enables an effective context window of 256K tokens, a significant leap for open-source models.

Jamba 1.5: Featuring the Hybrid Mamba-Transformer Architecture

Key Features and Capabilities:

  • Massive Context Window: Processes up to 256K tokens, ideal for lengthy documents and complex tasks.
  • Hybrid Architecture: Combines the strengths of Transformer and Mamba models for optimal efficiency and performance.
  • Efficient Quantization: Employs ExpertsInt8 quantization for reduced memory footprint and faster processing.
  • Multilingual Support: Functions effectively across nine languages: English, Spanish, French, Portuguese, Italian, Dutch, German, Arabic, and Hebrew.
  • Versatile Applications: Suitable for a wide range of NLP tasks, including question answering, summarization, text generation, and classification.
  • Accessible Deployment: Available via AI21's Studio API, Hugging Face, and cloud partners.
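
To build intuition for the quantization bullet above, here is a minimal sketch of generic symmetric INT8 weight quantization. This is an illustration only: the actual ExpertsInt8 scheme quantizes the MoE and MLP weights and dequantizes them inside a fused kernel at inference time, with details beyond this toy example.

```python
def quantize_int8(weights):
    """Symmetric per-tensor INT8 quantization: store int8 values plus one fp scale."""
    scale = max(abs(w) for w in weights) / 127.0
    return [max(-127, min(127, round(w / scale))) for w in weights], scale

def dequantize(quantized, scale):
    """Recover approximate fp weights from int8 values and the stored scale."""
    return [v * scale for v in quantized]

weights = [0.53, -1.27, 0.004, 0.89, -0.31]
q, scale = quantize_int8(weights)
restored = dequantize(q, scale)
print([round(w, 3) for w in restored])  # → [0.53, -1.27, 0.0, 0.89, -0.31]
```

Each weight is stored in one byte instead of two or four, which is where the memory savings in the MoE layers come from; the reconstruction error is bounded by half the scale.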

Architectural Details:

  • Base Architecture: Hybrid Transformer-Mamba architecture with a Mixture-of-Experts (MoE) module
  • Model Variants: Jamba-1.5-Large (94B active parameters, 398B total) and Jamba-1.5-Mini (12B active parameters, 52B total)
  • Layer Composition: 9 blocks of 8 layers each, with a 1:7 ratio of Transformer (attention) to Mamba layers
  • Mixture of Experts (MoE): 16 experts, with the top 2 selected per token
  • Hidden Dimension: 8192
  • Attention Heads: 64 query heads, 8 key-value heads (grouped-query attention)
  • Context Length: Up to 256K tokens
  • Quantization Technique: ExpertsInt8 for MoE and MLP layers
  • Activation Functions: Integrated Transformer and Mamba activations
  • Efficiency: Optimized for high throughput and low latency on 8x80GB GPUs
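
The MoE row above means each token is processed by only 2 of the 16 experts, keeping active compute far below the total parameter count. A minimal sketch of top-k gating (the expert networks themselves are omitted; this only shows how experts are selected and weighted):

```python
import math

NUM_EXPERTS = 16  # experts per MoE layer, as in Jamba 1.5
TOP_K = 2         # experts activated per token

def top_k_routing(gate_logits, k=TOP_K):
    """Pick the k highest-scoring experts and softmax-normalize their weights."""
    # Rank expert indices by gate score and keep the top k
    chosen = sorted(range(len(gate_logits)), key=lambda i: gate_logits[i], reverse=True)[:k]
    # Softmax restricted to the chosen experts, so their weights sum to 1
    exps = [math.exp(gate_logits[i]) for i in chosen]
    total = sum(exps)
    return [(i, e / total) for i, e in zip(chosen, exps)]

# Example gate scores for one token over 16 experts
logits = [3.0, 1.0, 0.5, 2.0] + [0.0] * 12
print(top_k_routing(logits))  # experts 0 and 3 win; their weights sum to 1
```

The token's output is then the weighted sum of the two selected experts' outputs, so per-token FLOPs scale with the 2 active experts rather than all 16.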

Accessing and Utilizing Jamba 1.5:

Jamba 1.5 is readily accessible through AI21's Studio API and Hugging Face. The model can be fine-tuned for specific domains to further enhance performance. A Python example using the AI21 API is provided below:

Python Example:

from ai21 import AI21Client
from ai21.models.chat import ChatMessage

client = AI21Client(api_key="YOUR_API_KEY")  # or set the AI21_API_KEY environment variable

messages = [ChatMessage(content="What's a tokenizer in 2-3 lines?", role="user")]
response = client.chat.completions.create(
    messages=messages,
    model="jamba-1.5-mini",
    stream=True,
)
# Stream the reply token by token as chunks arrive
for chunk in response:
    print(chunk.choices[0].delta.content, end="")


Conclusion:

Jamba 1.5 represents a significant advancement in large language models, offering a compelling blend of power and efficiency. Its ability to handle exceptionally long contexts, coupled with its versatile applications and accessible deployment options, makes it a valuable tool for a wide range of NLP tasks.

Frequently Asked Questions (FAQs):

  • Q1: What is Jamba 1.5? A: A hybrid Transformer-Mamba large language model with 94B (Large) or 12B (Mini) parameters, optimized for instruction following and long-context processing.
  • Q2: How does Jamba 1.5 handle long contexts efficiently? A: Through its hybrid architecture and ExpertsInt8 quantization, enabling a 256K token context window with reduced memory usage.
  • Q3: What is ExpertsInt8 quantization? A: A compression technique using INT8 precision in MoE and MLP layers for improved efficiency.
  • Q4: Is Jamba 1.5 publicly available? A: Yes, under the Jamba Open Model License, accessible via Hugging Face.
