How to Deploy an AI App (w/ Large Deps) to AWS Lambda

DDD | 2024-10-10

I recently spent two hours getting a simple LlamaIndex app to run on AWS Lambda. While the function itself consists of just a few lines of Python code (as shown below), managing dependencies and deployment can be tricky.

from llama_index.llms.openai import OpenAI

# The OpenAI client reads OPENAI_API_KEY from the environment
llm = OpenAI(model="gpt-4o-mini")

def lambda_handler(event, context):
    # Ask the model a fixed question and return the answer as the response body
    response = llm.complete("What public transportation might be available in a city?")
    return {
        'statusCode': 200,
        'body': str(response),
    }
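
You can sanity-check the handler locally before packaging it. A minimal sketch, assuming OPENAI_API_KEY is exported in your shell:

# Local smoke test; assumes OPENAI_API_KEY is set in your environment
if __name__ == "__main__":
    print(lambda_handler({}, None))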

Here are some key tips that helped me:

First, install packages for the correct platform. Lambda runs on Amazon Linux, so all packages must be installed for the "manylinux2014_x86_64" target platform; otherwise, packages with compiled components may be incompatible with the Lambda runtime. To ensure compatibility, install dependencies with the following command:

pip install -r requirements.txt --platform manylinux2014_x86_64 --target ./deps --only-binary=:all:
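
For reference, the requirements.txt in this example can stay minimal. Assuming you only need the OpenAI integration (the exact package names depend on which LlamaIndex integrations you use), it might contain just:

# requirements.txt -- assumed minimal dependency set for this handler
llama-index-llms-openai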

Second, Lambda limits the unzipped deployment package to 250MB in total, which can quickly become an issue if you're using LlamaIndex or other large dependencies. If your package exceeds this limit, find the largest directories (run this from inside your deps folder):

du -h -d 2 | sort -hr | head -n20

In my case, I found that the pandas/tests directory alone took up about 35MB; my function didn't need it, so I removed it to bring the package back under the limit.

rm -r deps/pandas/tests

Then, zip everything up and upload via S3. After trimming unnecessary files, create a zip archive containing both your code and dependencies. Since Lambda's console only accepts direct uploads up to 50MB, you'll need to put larger zip files in an S3 bucket and deploy the function from the S3 URI (a CLI sketch follows the zip commands below).

zip -r test_lambda.zip data/ lambda_function.py
cd deps/
zip -r ../test_lambda.zip .
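
From there, the AWS CLI can handle the upload and deployment. A sketch, where the bucket and function names are placeholders for your own:

cd ..
aws s3 cp test_lambda.zip s3://my-deploy-bucket/test_lambda.zip
aws lambda update-function-code \
    --function-name my-ai-app \
    --s3-bucket my-deploy-bucket \
    --s3-key test_lambda.zip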

Finally, adjust your Lambda settings before deploying. By default, a Lambda function gets only 128MB of memory and a 3-second timeout, which is not enough for many AI applications that load large dependencies and call LLMs. I'd increase the memory to 512MB and extend the timeout to 30 seconds. Also, don't forget to set essential environment variables such as your OpenAI API key.
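
The same settings can be applied from the CLI. A sketch with a placeholder function name (note that passing a real API key on the command line will leave it in your shell history):

aws lambda update-function-configuration \
    --function-name my-ai-app \
    --memory-size 512 \
    --timeout 30 \
    --environment "Variables={OPENAI_API_KEY=sk-...}"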

It took me quite a few tries to figure out the correct way to install packages and bundle everything together for Lambda. AWS Lambda is user-friendly for basic scripts, but once you add larger dependencies, things get more complicated.

Here's the final sequence of steps:

# Install dependencies
pip install -r requirements.txt --platform manylinux2014_x86_64 --target ./deps --only-binary=:all:

# Create a zip file for code and data
zip -r test_lambda.zip data/ lambda_function.py

# Include dependencies in the zip file, while removing large unused files
cd deps/
rm -r pandas/tests
zip -r ../test_lambda.zip .

P.S. I also tried deploying a similar function on DBOS Cloud, and it took only a single command:

dbos-cloud app deploy

With DBOS Cloud, dependency management was handled automatically via the requirements.txt file, and environment variables were set in dbos-config.yaml. I might be biased, but I enjoy the simplicity of its deployment process.
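
For what it's worth, here's roughly what the relevant part of my dbos-config.yaml looked like. This is a sketch from memory of the DBOS docs, so treat the field names as assumptions and check them against your own config:

# dbos-config.yaml (sketch; the app name is a placeholder)
name: my-ai-app
language: python
env:
  OPENAI_API_KEY: ${OPENAI_API_KEY}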
