
How to Efficiently Upload Large Files (≥3GB) to a FastAPI Backend?

Mary-Kate Olsen
2024-11-28


Using Requests-Toolbelt

When using the requests-toolbelt library, declare the upload_file field with a filename, and set the request's Content-Type header to m.content_type so that it carries the multipart boundary generated by the encoder. Here's an example:

import requests
from requests_toolbelt.multipart.encoder import MultipartEncoder

url = '...'
filename = 'my_file.txt'

# The MultipartEncoder streams the file instead of loading it all into memory.
m = MultipartEncoder(fields={'upload_file': (filename, open(filename, 'rb'))})
r = requests.post(
    url,
    data=m,
    headers={'Content-Type': m.content_type},  # includes the boundary string
    verify=False,
)
print(r.request.headers)  # confirm that the 'Content-Type' header has been set
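
For multi-gigabyte uploads it can also be useful to report progress. Below is a minimal sketch using requests-toolbelt's MultipartEncoderMonitor; the callback and the printed message are illustrative choices, not required by the library:

import requests
from requests_toolbelt.multipart.encoder import MultipartEncoder, MultipartEncoderMonitor

url = '...'
filename = 'my_file.txt'

def print_progress(monitor):
    # Called each time a chunk of the encoded body is read and sent.
    print(f'{monitor.bytes_read} of {encoder.len} bytes sent', end='\r')

encoder = MultipartEncoder(fields={'upload_file': (filename, open(filename, 'rb'))})
monitor = MultipartEncoderMonitor(encoder, print_progress)

r = requests.post(
    url,
    data=monitor,
    headers={'Content-Type': monitor.content_type},
)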

Using Python Requests/HTTPX

Another option is to use Python's requests or HTTPX libraries. Here are examples for each:

Using requests:

import requests

url = '...'
filename = '...'

with open(filename, 'rb') as file:
    # Don't set the 'Content-Type' header yourself; requests generates a
    # multipart/form-data Content-Type that includes the boundary string.
    r = requests.post(
        url,
        files={'upload_file': file},
    )

Using HTTPX:

import httpx

url = '...'
filename = '...'

with open(filename, 'rb') as file:
    r = httpx.post(
        url,
        files={'upload_file': file},
    )

HTTPX streams multipart file uploads by default, whereas requests reads the whole file into memory before sending it when the file is passed through the files argument. In both cases the library sets the multipart/form-data Content-Type header (including the boundary) for you, so don't set it manually.
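
As a rough sketch, here is how a client could call the FastAPI endpoint shown later in this article, assuming the app runs locally at http://127.0.0.1:8000/upload (a hypothetical address) and expects an upload_file part plus a data form field:

import httpx

url = 'http://127.0.0.1:8000/upload'  # hypothetical address of the FastAPI app
filename = 'my_file.txt'

with open(filename, 'rb') as file:
    r = httpx.post(
        url,
        files={'upload_file': file},
        data={'data': 'Hello World!'},  # extra form field, matching the 'data' target
        timeout=None,                   # disable the default timeout for very large uploads
    )
print(r.status_code, r.json())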

Using FastAPI's .stream() Method

FastAPI's .stream() method (provided by Starlette's Request) lets you read the request body as an asynchronous stream of chunks, so a large file never has to be loaded into memory in one piece. To use this approach, follow these steps:

  1. Install the streaming-form-data library: it provides a streaming parser for multipart/form-data.
  2. Create a FastAPI endpoint: use the .stream() method to read the request body chunk by chunk, and feed each chunk to a streaming_form_data parser.
  3. Register targets: define FileTarget and ValueTarget objects to receive the file and the form data, respectively.

Uploaded File Size Validation

To ensure that the uploaded file does not exceed a specified size limit, you can attach a MaxSizeValidator to the file target; the parser then raises a ValidationError as soon as the accumulated size goes over the limit. Here's an example:

from streaming_form_data.targets import FileTarget
from streaming_form_data.validators import MaxSizeValidator

FILE_SIZE_LIMIT = 1024 * 1024 * 1024  # 1 GB

# FileTarget streams the upload to the given path ('uploaded_file' here); the
# validator keeps a running total of the bytes written to the target and
# raises ValidationError once the limit is exceeded.
file_ = FileTarget('uploaded_file', validator=MaxSizeValidator(FILE_SIZE_LIMIT))
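
Note that MaxSizeValidator only bounds the file part itself. If you also want to cap the size of the entire request body (the file plus any other form fields), one option is a small custom validator that you call on every chunk before handing it to the parser. The class below is a sketch of such a helper, not part of streaming-form-data; its name, exception, and limit are illustrative:

MAX_REQUEST_BODY_SIZE = 1024 * 1024 * 1024 + 1024  # file limit plus some headroom

class MaxBodySizeException(Exception):
    def __init__(self, body_len: int):
        self.body_len = body_len

class MaxBodySizeValidator:
    # Keeps a running total of the bytes received and raises once the limit is exceeded.
    def __init__(self, max_size: int):
        self.body_len = 0
        self.max_size = max_size

    def __call__(self, chunk: bytes):
        self.body_len += len(chunk)
        if self.body_len > self.max_size:
            raise MaxBodySizeException(self.body_len)

# Inside the endpoint, call it on every chunk before feeding the parser:
#     body_validator = MaxBodySizeValidator(MAX_REQUEST_BODY_SIZE)
#     async for chunk in request.stream():
#         body_validator(chunk)
#         parser.data_received(chunk)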

Implementing the Endpoint

Here's an example endpoint that incorporates these techniques:

from fastapi import FastAPI, Request
from streaming_form_data import StreamingFormDataParser
from streaming_form_data.targets import FileTarget, ValueTarget
from streaming_form_data.validators import MaxSizeValidator

FILE_SIZE_LIMIT = 1024 * 1024 * 1024  # 1 GB

app = FastAPI()

@app.post('/upload')
async def upload(request: Request):
    # Parse the HTTP headers to retrieve the multipart boundary string.
    parser = StreamingFormDataParser(headers=request.headers)

    # Register the targets: the file is streamed straight to disk, and the
    # attached MaxSizeValidator enforces the size limit as the chunks arrive.
    file_ = FileTarget('uploaded_file', validator=MaxSizeValidator(FILE_SIZE_LIMIT))
    data = ValueTarget()
    parser.register('upload_file', file_)
    parser.register('data', data)

    # Read the request body chunk by chunk, without loading it all into memory.
    async for chunk in request.stream():
        parser.data_received(chunk)

    # Process the uploaded file and form data.
    return {'message': 'File uploaded successfully!'}
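
When the size limit is exceeded, data_received() raises ValidationError (importable from streaming_form_data.validators). As a sketch, the streaming loop in the endpoint above could be wrapped like this to report the error to the client; the 413 status code, the HTTPException (from fastapi import HTTPException, status), and the message text are choices of this example rather than anything prescribed by the library:

    try:
        # Same loop as above, but abort cleanly if the validator trips.
        async for chunk in request.stream():
            parser.data_received(chunk)
    except ValidationError:
        raise HTTPException(
            status_code=status.HTTP_413_REQUEST_ENTITY_TOO_LARGE,
            detail=f'Maximum file size limit ({FILE_SIZE_LIMIT} bytes) exceeded',
        )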
