Experiment with Chainlit AI interface with RAG on Upsun

Patricia Arquette
2025-01-21

Chainlit: A Scalable Conversational AI Framework

Chainlit is an open-source, asynchronous Python framework designed for building robust and scalable conversational AI applications. It offers a flexible foundation, allowing developers to integrate external APIs, custom logic, and local models seamlessly.

This tutorial demonstrates two Retrieval Augmented Generation (RAG) implementations within Chainlit:

  1. Leveraging OpenAI Assistants with uploaded documents.
  2. Utilizing llama_index with a local document folder.

Local Chainlit Setup

Virtual Environment

Create a virtual environment:

<code class="language-bash">mkdir chainlit && cd chainlit
python3 -m venv venv
source venv/bin/activate</code>

Install Dependencies

Install required packages and save dependencies:

<code class="language-bash">pip install chainlit
pip install llama_index  # For implementation #2
pip install openai
pip freeze > requirements.txt</code>

Test Chainlit

Start Chainlit:

<code class="language-bash">chainlit hello</code>

Access the placeholder app in your browser at http://localhost:8000.

Upsun Deployment

Git Initialization

Initialize a Git repository:

<code class="language-bash">git init .</code>

Create a .gitignore file:

<code>.env
database/**
data/**
storage/**
.chainlit
venv
__pycache__</code>

Upsun Project Creation

Create an Upsun project with the CLI command <code>upsun project:create</code> and follow the prompts. Upsun automatically configures the project's Git remote.

Configuration

Example Upsun configuration for Chainlit:

<code class="language-yaml">applications:
  chainlit:
    source:
      root: "/"
    type: "python:3.11"
    mounts:
      "/database":
        source: "storage"
        source_path: "database"
      ".files":
        source: "storage"
        source_path: "files"
      "__pycache__":
        source: "storage"
        source_path: "pycache"
      ".chainlit":
        source: "storage"
        source_path: ".chainlit"
    web:
      commands:
        start: "chainlit run app.py --port $PORT --host 0.0.0.0"
      upstream:
        socket_family: tcp
      locations:
        "/":
          passthru: true
        "/public":
          passthru: true
    build:
      flavor: none
    hooks:
      build: |
        set -eux
        pip install -r requirements.txt
      deploy: |
        set -eux
      # post_deploy: |
routes:
  "https://{default}/":
    type: upstream
    upstream: "chainlit:http"
  "https://www.{default}":
    type: redirect
    to: "https://{default}/"</code>

Set the OPENAI_API_KEY environment variable via Upsun CLI:

<code class="language-bash">upsun variable:create env:OPENAI_API_KEY --value=sk-proj[...]</code>

Deployment

Commit and deploy:

<code class="language-bash">git add .
git commit -m "First chainlit example"
upsun push</code>

Review the deployment status. Successful deployment will show Chainlit running on your main environment.

Implementation 1: OpenAI Assistant & Uploaded Files

This implementation uses an OpenAI assistant to process uploaded documents.

Assistant Creation

Create a new OpenAI assistant on the OpenAI Platform. Set system instructions, choose a model (with text response format), and keep the temperature low (e.g., 0.10). Copy the assistant ID (asst_[xxx]) and set it as an environment variable:

<code class="language-bash">upsun variable:create env:OPENAI_ASSISTANT_ID --value=asst_[...]</code>

Content Upload

Upload your documents (Markdown preferred) to the assistant. OpenAI will create a vector store.

Assistant Logic (app.py)

Replace app.py content with the provided code. Key parts: @cl.on_chat_start creates a new OpenAI thread, and @cl.on_message sends user messages to the thread and streams the response.

Commit and deploy the changes. Test the assistant.

Implementation 2: OpenAI llama_index

This implementation uses llama_index for local knowledge management and OpenAI for response generation.

Branch Creation

Create a new branch for this implementation (the branch name is arbitrary):

<code class="language-bash">git checkout -b llama-index</code>

Folder Creation and Mounts

Create data and storage folders. Add mounts to the Upsun configuration.
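The new mounts would extend the `mounts` block of the configuration shown earlier, for example:

```yaml
      "/data":
        source: "storage"
        source_path: "data"
      "/storage":
        source: "storage"
        source_path: "storage"
```

These match the `data/**` and `storage/**` entries already present in .gitignore: `data` holds your source documents, `storage` the persisted llama_index index.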

app.py Update

Update app.py with the provided llama_index code. This code loads documents, creates a VectorStoreIndex, and uses it to answer queries via OpenAI.

Deploy the new environment and upload the data folder. Test the application.

Bonus: Authentication

Add authentication using a SQLite database.

Database Setup

Create a database folder and add a mount to the Upsun configuration. Create an environment variable for the database path:

<code class="language-bash">mkdir database
# The variable name is illustrative; use whatever name your app.py reads.
upsun variable:create env:CHAINLIT_DB_PATH --value=/app/database/chainlit.db</code>

Authentication Logic (app.py)

Add authentication logic to app.py using @cl.password_auth_callback. This adds a login form.

Create a script to generate hashed passwords. Add users to the database (using hashed passwords). Deploy the authentication and test login.
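Such a script can be written with the standard library alone. This sketch assumes the same illustrative `users(username, salt, pw_hash)` schema, storing a per-user random salt and a PBKDF2-SHA256 hash rather than the plain password:

```python
# Illustrative helper for seeding users into the SQLite database.
import hashlib
import secrets
import sqlite3

def add_user(db_path: str, username: str, password: str) -> None:
    """Store a salted PBKDF2-SHA256 hash -- never the plain password."""
    salt = secrets.token_bytes(16)
    pw_hash = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 100_000)
    with sqlite3.connect(db_path) as conn:
        conn.execute(
            "CREATE TABLE IF NOT EXISTS users "
            "(username TEXT PRIMARY KEY, salt TEXT, pw_hash TEXT)"
        )
        conn.execute(
            "INSERT OR REPLACE INTO users VALUES (?, ?, ?)",
            (username, salt.hex(), pw_hash.hex()),
        )
```

Calling `add_user("database/chainlit.db", "alice", "s3cret")` once (for example in a one-off deploy hook or a local shell) creates the table if needed and inserts the user.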

Conclusion

This tutorial demonstrated deploying a Chainlit application on Upsun with two RAG implementations and authentication. The flexible architecture allows for various adaptations and integrations.
