Chainlit: A Scalable Conversational AI Framework
Chainlit is an open-source, asynchronous Python framework designed for building robust and scalable conversational AI applications. It offers a flexible foundation, allowing developers to integrate external APIs, custom logic, and local models seamlessly.
This tutorial demonstrates two Retrieval-Augmented Generation (RAG) implementations within Chainlit: one using an OpenAI assistant with uploaded files, and one using llama_index for local knowledge management.
Local Chainlit Setup
Create a virtual environment:
<code class="language-bash">mkdir chainlit && cd chainlit
python3 -m venv venv
source venv/bin/activate</code>
Install required packages and save dependencies:
<code class="language-bash">pip install chainlit
pip install llama_index  # For implementation #2
pip install openai
pip freeze > requirements.txt</code>
Start Chainlit:
<code class="language-bash">chainlit hello</code>
Open the placeholder app at http://localhost:8000.
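Before adding RAG, it helps to see the basic shape of a Chainlit app. Here is a minimal sketch of an app.py (the filename the start command uses) that simply echoes messages back; the handler names are illustrative:

```python
import chainlit as cl


@cl.on_chat_start
async def on_chat_start():
    # Runs once per session, when a user opens the chat.
    await cl.Message(content="Hello! Ask me anything.").send()


@cl.on_message
async def on_message(message: cl.Message):
    # Runs for every user message; echo it back for now.
    await cl.Message(content=f"You said: {message.content}").send()
```

Run it locally with chainlit run app.py and the UI picks up both handlers automatically.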
Upsun Deployment
Initialize a Git repository:
<code class="language-bash">git init .</code>
Create a .gitignore file:
<code>.env
database/**
data/**
storage/**
.chainlit
venv
__pycache__</code>
Create an Upsun project using the CLI (follow prompts). Upsun will automatically configure the remote repository.
Example Upsun configuration for Chainlit:
<code class="language-yaml">applications:
  chainlit:
    source:
      root: "/"
    type: "python:3.11"
    mounts:
      "/database":
        source: "storage"
        source_path: "database"
      ".files":
        source: "storage"
        source_path: "files"
      "__pycache__":
        source: "storage"
        source_path: "pycache"
      ".chainlit":
        source: "storage"
        source_path: ".chainlit"
    web:
      commands:
        start: "chainlit run app.py --port $PORT --host 0.0.0.0"
      upstream:
        socket_family: tcp
      locations:
        "/":
          passthru: true
        "/public":
          passthru: true
    build:
      flavor: none
    hooks:
      build: |
        set -eux
        pip install -r requirements.txt
      deploy: |
        set -eux
      # post_deploy: |
routes:
  "https://{default}/":
    type: upstream
    upstream: "chainlit:http"
  "https://www.{default}":
    type: redirect
    to: "https://{default}/"</code>
Set the OPENAI_API_KEY environment variable via the Upsun CLI:
<code class="language-bash">upsun variable:create env:OPENAI_API_KEY --value=sk-proj[...]</code>
Commit and deploy:
<code class="language-bash">git add .
git commit -m "First chainlit example"
upsun push</code>
Review the deployment status. Successful deployment will show Chainlit running on your main environment.
Implementation 1: OpenAI Assistant & Uploaded Files
This implementation uses an OpenAI assistant to process uploaded documents.
Create a new OpenAI assistant on the OpenAI Platform. Set system instructions, choose a model (with text response format), and keep the temperature low (e.g., 0.10). Copy the assistant ID (asst_[xxx]) and set it as an environment variable:
<code class="language-bash">upsun variable:create env:OPENAI_ASSISTANT_ID --value=asst_[...]</code>
Upload your documents (Markdown preferred) to the assistant. OpenAI will create a vector store.
Replace the app.py content with the provided code. Key parts: @cl.on_chat_start creates a new OpenAI thread, and @cl.on_message sends user messages to the thread and streams the response.
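The full app.py is not reproduced here, but the two handlers can be sketched roughly as follows, assuming the official openai Python client and the OPENAI_ASSISTANT_ID variable set above:

```python
import os

import chainlit as cl
from openai import AsyncOpenAI

client = AsyncOpenAI()  # reads OPENAI_API_KEY from the environment
ASSISTANT_ID = os.environ["OPENAI_ASSISTANT_ID"]


@cl.on_chat_start
async def on_chat_start():
    # One OpenAI thread per chat session, stored in the user session.
    thread = await client.beta.threads.create()
    cl.user_session.set("thread_id", thread.id)


@cl.on_message
async def on_message(message: cl.Message):
    thread_id = cl.user_session.get("thread_id")
    # Append the user message to the thread...
    await client.beta.threads.messages.create(
        thread_id=thread_id, role="user", content=message.content
    )
    # ...then run the assistant and stream tokens back into the Chainlit UI.
    reply = cl.Message(content="")
    async with client.beta.threads.runs.stream(
        thread_id=thread_id, assistant_id=ASSISTANT_ID
    ) as stream:
        async for token in stream.text_deltas:
            await reply.stream_token(token)
    await reply.send()
```

The assistant's vector store (built from the uploaded files) is applied server-side by OpenAI, so nothing retrieval-specific appears in this code.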
Commit and deploy the changes. Test the assistant.
Implementation 2: OpenAI + llama_index
This implementation uses llama_index for local knowledge management and OpenAI for response generation.
Create a new branch:
<code class="language-bash">git checkout -b llama-index</code>
Create data and storage folders, and add the corresponding mounts to the Upsun configuration.
Update app.py with the provided llama_index code. This code loads documents, creates a VectorStoreIndex, and uses it to answer queries via OpenAI.
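A minimal sketch of that app.py, assuming the llama_index.core API and the data/ and storage/ folders created above (the index is persisted to storage/ so it is only built once):

```python
import os

import chainlit as cl
from llama_index.core import (
    SimpleDirectoryReader,
    StorageContext,
    VectorStoreIndex,
    load_index_from_storage,
)

DATA_DIR = "./data"        # raw documents, uploaded alongside the app
STORAGE_DIR = "./storage"  # persisted vector index (a mounted directory)


def get_index():
    # Reuse the persisted index if present; otherwise build it from ./data.
    if os.path.isdir(STORAGE_DIR) and os.listdir(STORAGE_DIR):
        ctx = StorageContext.from_defaults(persist_dir=STORAGE_DIR)
        return load_index_from_storage(ctx)
    documents = SimpleDirectoryReader(DATA_DIR).load_data()
    index = VectorStoreIndex.from_documents(documents)
    index.storage_context.persist(persist_dir=STORAGE_DIR)
    return index


@cl.on_chat_start
async def on_chat_start():
    cl.user_session.set("query_engine", get_index().as_query_engine())


@cl.on_message
async def on_message(message: cl.Message):
    query_engine = cl.user_session.get("query_engine")
    # query() is synchronous, so run it off the event loop.
    response = await cl.make_async(query_engine.query)(message.content)
    await cl.Message(content=str(response)).send()
```

Here retrieval happens locally against the index, and llama_index calls OpenAI only to embed documents and generate the final answer.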
Deploy the new environment and upload the data folder. Test the application.
Bonus: Authentication
Add authentication using a SQLite database.
Create a database folder and add a mount to the Upsun configuration. Create an environment variable for the database path:
<code class="language-bash">upsun variable:create env:DATABASE_PATH --value="database/users.db"</code>
Add authentication logic to app.py using @cl.password_auth_callback. This adds a login form.
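A sketch of the callback, assuming a users table with salted PBKDF2 password hashes and a DATABASE_PATH variable pointing at the SQLite file (the variable name and schema are illustrative):

```python
import hashlib
import os
import sqlite3

import chainlit as cl

DB_PATH = os.environ.get("DATABASE_PATH", "database/users.db")


def hash_password(password: str, salt: bytes) -> str:
    # Must match however the stored hashes were generated.
    return hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 100_000).hex()


@cl.password_auth_callback
def auth_callback(username: str, password: str):
    conn = sqlite3.connect(DB_PATH)
    row = conn.execute(
        "SELECT salt, password_hash FROM users WHERE username = ?", (username,)
    ).fetchone()
    conn.close()
    if row and hash_password(password, row[0]) == row[1]:
        return cl.User(identifier=username)  # login accepted
    return None  # login rejected
```

Returning a cl.User grants the session; returning None keeps the login form up.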
Create a script to generate hashed passwords. Add users to the database (using hashed passwords). Deploy the authentication and test login.
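The user-creation script can be sketched with only the standard library; the table layout mirrors the auth callback above and is an assumption, not a Chainlit requirement:

```python
import hashlib
import os
import sqlite3


def hash_password(password: str, salt: bytes) -> str:
    # PBKDF2-HMAC-SHA256; the salt is stored next to the hash so the
    # login callback can re-derive and compare.
    return hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 100_000).hex()


def add_user(db_path: str, username: str, password: str) -> None:
    salt = os.urandom(16)
    conn = sqlite3.connect(db_path)
    conn.execute(
        "CREATE TABLE IF NOT EXISTS users "
        "(username TEXT PRIMARY KEY, salt BLOB, password_hash TEXT)"
    )
    conn.execute(
        "INSERT OR REPLACE INTO users (username, salt, password_hash) "
        "VALUES (?, ?, ?)",
        (username, salt, hash_password(password, salt)),
    )
    conn.commit()
    conn.close()


if __name__ == "__main__":
    os.makedirs("database", exist_ok=True)
    add_user("database/users.db", "admin", "change-me")
```

Run it once per user, then redeploy so the mounted database is in place.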
Conclusion
This tutorial demonstrated deploying a Chainlit application on Upsun with two RAG implementations and authentication. The flexible architecture allows for various adaptations and integrations.