
Deploying DeepSeek R1 on Databricks: A Step-by-Step Guide

Jennifer Aniston · Original · 2025-02-28


Databricks, a popular data engineering platform, is increasingly used for AI and machine learning tasks. This tutorial guides you through deploying the distilled DeepSeek R1 model (DeepSeek-R1-Distill-Llama-8B) on Databricks, a powerful large language model often preferred for on-premise deployment because it avoids sending data to external servers. For a deeper dive into DeepSeek R1's features and comparisons, see the DeepSeek-R1: Features, Comparison, Distilled Models & More blog.

This guide covers account setup, model registration using the UI, and access via the playground and local CURL commands. New to Databricks? The Introduction to Databricks course provides a comprehensive overview of the Databricks Lakehouse platform and its data management capabilities. For a deeper understanding of data management within Databricks, consider the Data Management in Databricks course.

Registering the DeepSeek R1 Model

  1. Launch a Notebook: After creating your Databricks workspace, click "New" and select a notebook.


  2. Install Packages: Install the necessary Python libraries:
<code class="language-python">%%capture
!pip install torch transformers mlflow accelerate torchvision
%restart_python</code>
  3. Load Model and Tokenizer: Load the DeepSeek R1 model and tokenizer from Hugging Face:
<code class="language-python">import pandas as pd
import mlflow
import mlflow.transformers
import torch
from mlflow.models.signature import infer_signature
from transformers import AutoModelForCausalLM, AutoTokenizer, AutoConfig, pipeline

model_name = "deepseek-ai/DeepSeek-R1-Distill-Llama-8B"
tokenizer = AutoTokenizer.from_pretrained(model_name)
config = AutoConfig.from_pretrained(model_name)
# Load the weights in half precision (float16) to roughly halve GPU memory usage.
model = AutoModelForCausalLM.from_pretrained(model_name, config=config, torch_dtype=torch.float16)</code>


  4. Test the Model: Test with a sample prompt and generate a signature for model registration:
<code class="language-python">text_generator = pipeline("text-generation", model=model, tokenizer=tokenizer)
example_prompt = "How does a computer work?"
example_inputs = pd.DataFrame({"inputs": [example_prompt]})
example_outputs = text_generator(example_prompt, max_length=200)
signature = infer_signature(example_inputs, example_outputs)
print(example_outputs)</code>

Expected output (may vary slightly):

<code>[{'generated_text': "How does a computer work? What is the computer? What is the computer used for? What is the computer used for in real life?\n\nI need to answer this question, but I need to do it step by step. I need to start with the very basic level and build up from there. I need to make sure I understand each concept before moving on. I need to use a lot of examples to explain each idea. I need to write my thoughts as if I'm explaining them to someone else, but I need to make sure I understand how to structure the answer properly.\n\nOkay, let's start with the basic level. What is a computer? It's an electronic device, right? And it has a central processing unit (CPU) that does the processing. But I think the central processing unit is more efficient, so maybe it's the CPU. Then, it has memory and storage. I remember that memory is like RAM and storage is like ROM. But wait, I think"}]</code>
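As the output above shows, the pipeline returns a list of dictionaries, each with a generated_text field. If you want just the completion strings (for logging or building a serving response), a small helper like the following works; this is a convenience sketch, not part of the registered model:

```python
def extract_text(pipeline_output):
    """Pull the generated strings out of a transformers text-generation result.

    The pipeline returns a list of dicts shaped like
    [{"generated_text": "..."}]; this flattens it to a list of strings.
    """
    return [item["generated_text"] for item in pipeline_output]

sample = [{"generated_text": "How does a computer work? It's an electronic device..."}]
print(extract_text(sample))
```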
  5. Conda Environment: Define a conda environment for model serving:
<code class="language-python">conda_env = {
    "name": "mlflow-env",
    "channels": ["defaults", "conda-forge"],
    "dependencies": [
        "python=3.11",
        "pip",
        {"pip": ["mlflow", "transformers", "accelerate", "torch", "torchvision"]}
    ]
}</code>
  6. Register the Model: Register the model using mlflow.transformers.log_model:
<code class="language-python">with mlflow.start_run() as run:
    mlflow.transformers.log_model(
        transformers_model=text_generator,
        artifact_path="deepseek_model",
        signature=signature,
        input_example=example_inputs,
        registered_model_name="deepseek_r1_llama_8b",
        conda_env=conda_env
    )</code>


Deploying DeepSeek R1

  1. Navigate to Models: In the Databricks dashboard, go to the "Models" tab.

  2. Serve the Model: Select your model and click "Serve this model."


  3. Configure Endpoint: Name your endpoint, choose compute options, set concurrency, and click "Create."
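If you prefer to script endpoint creation rather than click through the UI, Databricks also exposes a serving-endpoints REST API. The sketch below only builds the JSON body for a `POST /api/2.0/serving-endpoints` call; the endpoint name, model version, and workload size are placeholder assumptions you should adapt to your workspace, and the exact field set may vary across API versions.

```python
import json

def build_serving_endpoint_payload(endpoint_name, model_name, model_version,
                                   workload_size="Small", scale_to_zero=True):
    """Build the JSON body for a Databricks create-serving-endpoint request.

    workload_size controls the compute tier; scale_to_zero lets the endpoint
    spin down when idle to save cost.
    """
    return {
        "name": endpoint_name,
        "config": {
            "served_entities": [
                {
                    "entity_name": model_name,        # registered model name
                    "entity_version": model_version,  # version from the Models tab
                    "workload_size": workload_size,
                    "scale_to_zero_enabled": scale_to_zero,
                }
            ]
        },
    }

payload = build_serving_endpoint_payload(
    "deepseek-r1-endpoint", "deepseek_r1_llama_8b", "1"
)
print(json.dumps(payload, indent=2))
```

The resulting JSON would be sent with an `Authorization: Bearer $DATABRICKS_TOKEN` header; the UI path above achieves the same result without scripting.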


For fine-tuning on a custom dataset, refer to the Fine-Tuning DeepSeek R1 tutorial.

Accessing the Deployed Model

  1. Databricks Playground: Test directly in the Databricks playground.


  2. CURL Command: Generate a Databricks API key (Settings > Developer), set it as an environment variable ($DATABRICKS_TOKEN), and query the endpoint with CURL. Replace the workspace URL and endpoint name below with your own values:
<code class="language-bash">curl -X POST "https://<your-workspace-url>/serving-endpoints/<your-endpoint-name>/invocations" \
  -H "Authorization: Bearer $DATABRICKS_TOKEN" \
  -H "Content-Type: application/json" \
  -d '{"dataframe_records": [{"inputs": "How does a computer work?"}]}'</code>
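The same invocation can be made from Python using only the standard library. This is a minimal sketch: the workspace URL and endpoint name are placeholders, and the body uses the dataframe_records format to mirror the pandas DataFrame signature ({"inputs": ...}) registered earlier.

```python
import json
import os
import urllib.request

def build_invocation_request(workspace_url, endpoint_name, prompt, token):
    """Build an urllib Request for a Databricks serving-endpoint invocation.

    The JSON body matches the model signature inferred from a DataFrame
    with a single "inputs" column.
    """
    url = f"{workspace_url}/serving-endpoints/{endpoint_name}/invocations"
    body = json.dumps({"dataframe_records": [{"inputs": prompt}]}).encode("utf-8")
    headers = {
        "Authorization": f"Bearer {token}",
        "Content-Type": "application/json",
    }
    return urllib.request.Request(url, data=body, headers=headers, method="POST")

# Placeholders: substitute your own workspace URL and endpoint name.
req = build_invocation_request(
    "https://example-workspace.cloud.databricks.com",
    "deepseek-r1-endpoint",
    "How does a computer work?",
    os.environ.get("DATABRICKS_TOKEN", "dapi-placeholder"),
)
# Uncomment to send the request against a live endpoint:
# with urllib.request.urlopen(req) as resp:
#     print(json.loads(resp.read()))
print(req.full_url)
```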


For information on DeepSeek R1 vs. V3, see the DeepSeek R1 vs V3 blog. New to LLMs? The Introduction to LLMs in Python course is a great starting point. Remember that while CPU deployment is possible, it might be slower.
