Getting Started with Phi-2

This blog post delves into Microsoft's Phi-2 language model, comparing its performance to other models and detailing its training process. We'll also cover how to access and fine-tune Phi-2 using the Transformers library and a Hugging Face role-playing dataset.

Phi-2, a 2.7 billion-parameter model from Microsoft's "Phi" series, aims for state-of-the-art performance despite its relatively small size. It employs a Transformer architecture, trained on 1.4 trillion tokens from synthetic and web datasets focusing on NLP and coding. Unlike many larger models, Phi-2 is a base model without instruction fine-tuning or RLHF.
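Because Phi-2 ships without instruction tuning, prompt formatting matters more than it does for chat-tuned models. As a minimal sketch, the instruct-style template below follows the QA format suggested on the microsoft/phi-2 model card; treat it as a convention rather than a requirement:

# Instruct-style QA prompt; Phi-2 also accepts plain completion and code prompts
prompt = "Instruct: Explain what a transformer architecture is.\nOutput:"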

Two key aspects drove Phi-2's development:

  • High-Quality Training Data: Prioritizing "textbook-quality" data, including synthetic datasets and high-value web content, to instill common sense reasoning, general knowledge, and scientific understanding.
  • Scaled Knowledge Transfer: Leveraging knowledge from the 1.3 billion parameter Phi-1.5 model to accelerate training and boost benchmark scores.

For insights into building similar LLMs, consider the Master LLM Concepts course.

Phi-2 Benchmarks

Phi-2 surpasses 7B-13B parameter models like Llama-2 and Mistral across various benchmarks (common sense reasoning, language understanding, math, coding). Remarkably, it outperforms the significantly larger Llama-2-70B on multi-step reasoning tasks.


Microsoft's focus on smaller, easily fine-tuned models enables deployment on mobile devices while delivering performance comparable to much larger models. Phi-2 even outperforms Google Gemini Nano 2 on the Big Bench Hard, BoolQ, and MBPP benchmarks.


Accessing Phi-2

Explore Phi-2's capabilities via the Hugging Face Spaces demo: Phi 2 Streaming on GPU. This demo offers basic prompt-response functionality.


New to AI? The AI Fundamentals skill track is a great starting point.

Let's use the transformers pipeline for inference (ensure you have the latest transformers and accelerate installed).

!pip install -q -U transformers
!pip install -q -U accelerate

from transformers import pipeline

model_name = "microsoft/phi-2"

pipe = pipeline(
    "text-generation",
    model=model_name,
    device_map="auto",
    trust_remote_code=True,
)

Generate text from a prompt, adjusting parameters like max_new_tokens and temperature. The model's Markdown-formatted output is rendered with IPython's Markdown display.

from IPython.display import Markdown

prompt = "Please create a Python application that can change wallpapers automatically."

outputs = pipe(
    prompt,
    max_new_tokens=300,
    do_sample=True,
    temperature=0.7,
    top_k=50,
    top_p=0.95,
)
Markdown(outputs[0]["generated_text"])

Phi-2's output is impressive, generating code with explanations.


Phi-2 Applications

Phi-2's compact size allows for use on laptops and mobile devices for Q&A, code generation, and basic conversations.
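As a quick sketch of local Q&A (reusing the pipe object built above; the prompt and decoding settings here are illustrative, not from the original tutorial):

# Short factual Q&A with greedy decoding, reusing the earlier pipeline
outputs = pipe(
    "Instruct: What is the capital of France?\nOutput:",
    max_new_tokens=30,
    do_sample=False,
)
print(outputs[0]["generated_text"])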

Fine-tuning Phi-2

This section demonstrates fine-tuning Phi-2 on the hieunguyenminh/roleplay dataset using PEFT.

Setup and Installation

Install the libraries needed for quantized LoRA fine-tuning:

%%capture
%pip install -U bitsandbytes
%pip install -U transformers
%pip install -U peft
%pip install -U accelerate
%pip install -U datasets
%pip install -U trl

Import necessary libraries:

from transformers import (
    AutoModelForCausalLM,
    AutoTokenizer,
    BitsAndBytesConfig,
    TrainingArguments,
    pipeline,
    logging,
)
from peft import (
    LoraConfig,
    PeftModel,
    prepare_model_for_kbit_training,
    get_peft_model,
)
import os, torch
from datasets import load_dataset
from trl import SFTTrainer

Define variables for the base model, dataset, and fine-tuned model name:

base_model = "microsoft/phi-2"
dataset_name = "hieunguyenminh/roleplay"
new_model = "phi-2-role-play"

Hugging Face Login

Login using your Hugging Face API token. (Replace with your actual token retrieval method).

# ... (Method to securely retrieve Hugging Face API token) ...
!huggingface-cli login --token $secret_hf
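One way to populate secret_hf before running the login cell (an illustrative option; Kaggle or Colab secret managers work just as well) is to read the token interactively:

from getpass import getpass

# Read the token interactively so it never appears in the notebook source
secret_hf = getpass("Hugging Face API token: ")

In IPython, $secret_hf in the shell command expands to this Python variable.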


Loading the Dataset

Load a subset of the dataset for faster training:

dataset = load_dataset(dataset_name, split="train[0:1000]")
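Before training, it is worth sanity-checking one record; the SFTTrainer configuration later in this tutorial assumes each example carries its full formatted conversation in a text field:

# Peek at one example; the trainer reads the "text" field of each record
print(dataset[0]["text"][:500])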

Loading Model and Tokenizer

Load the 4-bit quantized model for memory efficiency:

bnb_config = BitsAndBytesConfig(
    load_in_4bit=True,
    bnb_4bit_quant_type="nf4",
    bnb_4bit_compute_dtype=torch.bfloat16,
    bnb_4bit_use_double_quant=False,
)

model = AutoModelForCausalLM.from_pretrained(
    base_model,
    quantization_config=bnb_config,
    device_map="auto",
    trust_remote_code=True,
)

model.config.use_cache = False
model.config.pretraining_tp = 1

tokenizer = AutoTokenizer.from_pretrained(base_model, trust_remote_code=True)
tokenizer.pad_token = tokenizer.eos_token

Adding Adapter Layers

Add LoRA layers for efficient fine-tuning:

model = prepare_model_for_kbit_training(model)
peft_config = LoraConfig(
    r=16,
    lora_alpha=16,
    lora_dropout=0.05,
    bias="none",
    task_type="CAUSAL_LM",
    target_modules=[
        "q_proj",
        "k_proj",
        "v_proj",
        "dense",
        "fc1",
        "fc2",
    ],
)
model = get_peft_model(model, peft_config)
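To confirm that only the adapter weights will be trained, print the trainable-parameter count with PEFT's built-in utility; it should report a small fraction of the full 2.7 billion parameters:

# Reports trainable vs. total parameters after wrapping the model with LoRA
model.print_trainable_parameters()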

Training

Set up training arguments and the SFTTrainer:

training_arguments = TrainingArguments(
    output_dir="./results",  # Replace with your desired output directory
    num_train_epochs=1,
    per_device_train_batch_size=2,
    gradient_accumulation_steps=1,
    optim="paged_adamw_32bit",
    save_strategy="epoch",
    logging_steps=100,
    logging_strategy="steps",
    learning_rate=2e-4,
    fp16=False,
    bf16=False,
    group_by_length=True,
    disable_tqdm=False,
    report_to="none",
)

# Note: newer trl releases move max_seq_length, dataset_text_field, and packing
# into SFTConfig; the keyword arguments below follow the older SFTTrainer API.
trainer = SFTTrainer(
    model=model,
    train_dataset=dataset,
    peft_config=peft_config,
    max_seq_length=2048,
    dataset_text_field="text",
    tokenizer=tokenizer,
    args=training_arguments,
    packing=False,
)

trainer.train()


Saving and Pushing the Model

Save and upload the fine-tuned model:
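A minimal sketch of this step, assuming the Hugging Face login above succeeded; note that it saves and uploads only the LoRA adapter plus tokenizer, not a merged full model:

# Save the adapter and tokenizer locally under the name defined earlier
trainer.model.save_pretrained(new_model)
tokenizer.save_pretrained(new_model)

# Push both to the Hugging Face Hub
trainer.model.push_to_hub(new_model, use_temp_dir=False)
tokenizer.push_to_hub(new_model, use_temp_dir=False)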



Model Evaluation

Evaluate the fine-tuned model:
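A quick qualitative check, sketched under the assumption that the adapter-augmented model and tokenizer are still in memory; the prompt is illustrative, and results improve if you mirror the dataset's own formatting:

# Silence verbose generation warnings
logging.set_verbosity(logging.CRITICAL)

# Build a generation pipeline around the fine-tuned model
pipe = pipeline(
    task="text-generation",
    model=model,
    tokenizer=tokenizer,
    max_new_tokens=200,
)

prompt = "I want you to act as a wise medieval innkeeper. Greet a weary traveler."
result = pipe(prompt)
print(result[0]["generated_text"])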



Conclusion

This tutorial provided a comprehensive overview of Microsoft's Phi-2, its performance, training, and fine-tuning. The ability to fine-tune this smaller model efficiently opens up possibilities for customized applications and deployments. Further exploration into building LLM applications using frameworks like LangChain is recommended.
