314-billion-parameter Grok-1 inference accelerated by 3.8x, PyTorch + HuggingFace version is here

Musk promised to open source Grok-1, and the open source community was ecstatic.

But modifying Grok-1, or building commercial products on top of it, is still somewhat difficult:

Grok-1 is built with Rust and JAX, so the barrier to entry is high for users accustomed to mainstream software ecosystems such as Python, PyTorch, and HuggingFace.

△Caption: Grok ranks No. 1 worldwide on GitHub's trending list

The Colossal-AI team's latest work addresses exactly this need: it provides an easy-to-use Python + PyTorch + HuggingFace version of Grok-1 that cuts inference latency by nearly 4x.

Now, the model has been published on HuggingFace and ModelScope.

HuggingFace download link:
https://www.php.cn/link/335396ce0d3f6e808c26132f91916eae

ModelScope download link:
https://www.php.cn/link/7ae7778c9ae86d2ded133e891995dc9e

Performance Optimization

Drawing on Colossal-AI's extensive experience in system optimization for large AI models, the team quickly added tensor-parallel support for Grok-1.

On a single server with 8x H800 80GB GPUs, inference latency is nearly 4x lower than with approaches such as JAX and HuggingFace's auto device map.

[Figure: inference latency comparison across methods]
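For intuition about what tensor parallelism does, here is a toy column-parallel sketch; it is not Colossal-AI's actual API, and the shapes and device list are illustrative assumptions. A linear layer's weight matrix is split across devices, each device computes its own output shard, and the shards are gathered at the end.

```python
# Toy sketch of column-parallel tensor parallelism (not Colossal-AI's actual API).
# A linear layer's weight is split column-wise across devices; each device computes
# a shard of the output, and the shards are concatenated afterwards.
import torch

def column_parallel_linear(x, weight, devices):
    # x: (batch, in_dim); weight: (in_dim, out_dim); devices: list of torch.device
    shards = torch.chunk(weight, len(devices), dim=1)   # split the output dimension
    outputs = [(x.to(dev) @ shard.to(dev)).cpu() for shard, dev in zip(shards, devices)]
    return torch.cat(outputs, dim=1)                    # gather the output shards

# Toy usage with made-up sizes; CPU devices stand in for the 8x H800 GPUs.
x = torch.randn(2, 6144)
w = torch.randn(6144, 32768)
y = column_parallel_linear(x, w, [torch.device("cpu")] * 4)
print(y.shape)  # torch.Size([2, 32768])
```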

Usage Tutorial

After downloading and installing Colossal-AI, simply launch the inference script:

./run_inference_fast.sh hpcaitech/grok-1

Model weights are downloaded and loaded automatically, and inference results stay aligned with the original implementation. The figure below shows a test run of Grok-1 greedy search.

[Figure: Grok-1 greedy search test run]
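If you prefer to call the HuggingFace checkpoint directly from Python, something like the following should work. This is a minimal sketch based on the standard transformers workflow rather than an official recipe, so the exact arguments (trust_remote_code, dtype, device mapping) are assumptions to double-check against the hpcaitech/grok-1 model card.

```python
# Hedged sketch: load the PyTorch Grok-1 port from HuggingFace and run greedy decoding.
# The arguments below follow the usual transformers workflow and are assumptions;
# consult the hpcaitech/grok-1 model card for the exact recommended settings.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "hpcaitech/grok-1"
tokenizer = AutoTokenizer.from_pretrained(model_id, trust_remote_code=True)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    trust_remote_code=True,
    torch_dtype=torch.bfloat16,  # Grok-1 weights are published in bf16
    device_map="auto",           # requires a machine with enough GPU memory (~300GB of weights)
)

inputs = tokenizer("Explain mixture-of-experts models in one sentence:", return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=32, do_sample=False)  # greedy search
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```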

For more details, please refer to the grok-1 usage example:
https://www.php.cn/link/e2575ed7d2c481c414c10e688bcbc4cf

The Behemoth Grok-1

In this open-source release, xAI published the base model weights and network architecture of Grok-1.

Specifically, it is the raw base model from the pre-training phase, which concluded in October 2023, and has not been fine-tuned for any specific application (such as dialogue).

Architecturally, Grok-1 uses a mixture-of-experts (MoE) design with 8 experts and 314B (314 billion) total parameters. Two experts are activated for each token, giving 86B active parameters.

The active parameter count alone already exceeds the 70B of the dense Llama 2 model; even for an MoE architecture, it is no exaggeration to call a model of this size a behemoth.
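To make the routing concrete, here is a minimal top-2 MoE sketch in PyTorch with tiny, made-up dimensions. It is illustrative only (no load balancing, not xAI's implementation), but it shows why only the two selected experts, roughly 86B of the 314B parameters, do work for any given token.

```python
# Minimal top-2 mixture-of-experts routing sketch (illustrative, not xAI's implementation).
import torch
import torch.nn as nn
import torch.nn.functional as F

class TinyTop2MoE(nn.Module):
    def __init__(self, d_model=64, d_hidden=256, num_experts=8, top_k=2):
        super().__init__()
        self.router = nn.Linear(d_model, num_experts)
        self.experts = nn.ModuleList(
            nn.Sequential(nn.Linear(d_model, d_hidden), nn.GELU(), nn.Linear(d_hidden, d_model))
            for _ in range(num_experts)
        )
        self.top_k = top_k

    def forward(self, x):                      # x: (num_tokens, d_model)
        scores = self.router(x)                # (num_tokens, num_experts)
        weights, idx = scores.topk(self.top_k, dim=-1)
        weights = F.softmax(weights, dim=-1)   # normalize over the chosen experts only
        out = torch.zeros_like(x)
        for k in range(self.top_k):            # only the selected experts run per token
            for e in range(len(self.experts)):
                mask = idx[:, k] == e
                if mask.any():
                    out[mask] += weights[mask, k:k + 1] * self.experts[e](x[mask])
        return out

moe = TinyTop2MoE()
tokens = torch.randn(5, 64)
print(moe(tokens).shape)  # torch.Size([5, 64])
```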

More parameter details are as follows (collected into a config sketch after the list):

  • Context window length is 8,192 tokens; precision is bf16
  • Tokenizer vocabulary size is 131,072 (2^17), close to that of GPT-4;
  • Embedding size is 6,144 (48×128);
  • There are 64 Transformer layers, each a decoder layer containing a multi-head attention block and a dense block;
  • The key/value size is 128;
  • The multi-head attention block uses 48 query heads and 8 KV heads, with a KV size of 128;
  • The dense (feedforward) block has an expansion factor of 8 and a hidden size of 32,768
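For convenience, the listed hyperparameters can be gathered into a single config sketch; the key names below are made up for readability, and only the values come from the release notes above.

```python
# Hypothetical config summary of the Grok-1 hyperparameters listed above;
# the key names are illustrative, only the values come from the release notes.
GROK1_CONFIG = {
    "context_length": 8192,        # tokens
    "precision": "bf16",
    "vocab_size": 131072,          # 2**17
    "embedding_size": 6144,        # 48 * 128
    "num_layers": 64,              # decoder layers with attention + dense blocks
    "num_query_heads": 48,
    "num_kv_heads": 8,
    "head_dim": 128,               # key/value size
    "ffn_expansion_factor": 8,
    "ffn_hidden_size": 32768,
    "num_experts": 8,              # MoE
    "experts_per_token": 2,
    "total_params": "314B",
    "active_params_per_token": "86B",
}
```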


On the GitHub page, xAI notes that because of the large model size (314B parameters), a machine with sufficient GPU memory is required to run Grok-1.

The MoE layer in this reference implementation is not particularly efficient; it was implemented this way to avoid the need for custom kernels while verifying the model's correctness.

The model weights are provided via a magnet link, with a total file size of close to 300GB.


It is worth noting that Grok-1 is released under the Apache 2.0 license, which is commercial-friendly.

At present, the Grok-1 repository has reached 43.9k stars on GitHub.

QbitAI has learned that Colossal-AI will roll out further optimizations for Grok-1 in the near future, such as parallel acceleration and quantization to reduce GPU memory cost; stay tuned.

Colossal-AI open source address: https://www.php.cn/link/b9531e7d2a8f38fe8dcc73f58cae9530
