Streaming File Uploads to AWS S3 with a Minimal Memory and Disk Footprint
Problem: You need to upload a large multipart/form-data file to AWS S3 while keeping memory usage low and avoiding writing the file to local disk.
Solution: Use the AWS SDK's s3manager.Uploader, which performs a multipart upload and streams the file in fixed-size parts instead of loading it into memory all at once:
Example Code:
```go
package main

import (
	"fmt"
	"os"

	"github.com/aws/aws-sdk-go/aws"
	"github.com/aws/aws-sdk-go/aws/credentials"
	"github.com/aws/aws-sdk-go/aws/session"
	"github.com/aws/aws-sdk-go/service/s3/s3manager"
)

// Amazon S3 credentials; leave empty to use the default credential chain.
var accessKey = ""
var accessSecret = ""

func main() {
	// Determine whether static credentials were provided.
	var awsConfig *aws.Config
	if accessKey == "" || accessSecret == "" {
		// No credentials provided; fall back to the default credential chain
		// (environment variables, shared credentials file, IAM role, ...).
		awsConfig = &aws.Config{
			Region: aws.String("us-west-2"),
		}
	} else {
		// Static credentials provided.
		awsConfig = &aws.Config{
			Region:      aws.String("us-west-2"),
			Credentials: credentials.NewStaticCredentials(accessKey, accessSecret, ""),
		}
	}

	// Create an AWS session with the configured credentials.
	sess := session.Must(session.NewSession(awsConfig))

	// Create an uploader with a customized configuration.
	uploader := s3manager.NewUploader(sess, func(u *s3manager.Uploader) {
		u.PartSize = 5 * 1024 * 1024 // 5MB parts (the S3 minimum part size)
		u.Concurrency = 2            // upload two parts at a time
	})

	// Open the file to be uploaded.
	f, err := os.Open("file_name.zip")
	if err != nil {
		fmt.Printf("Failed to open file: %v\n", err)
		return
	}
	defer f.Close()

	// Upload the file to S3; the uploader reads the Body in parts rather
	// than buffering the whole file in memory.
	result, err := uploader.Upload(&s3manager.UploadInput{
		Bucket: aws.String("myBucket"),
		Key:    aws.String("file_name.zip"),
		Body:   f,
	})
	if err != nil {
		fmt.Printf("Failed to upload file: %v\n", err)
		return
	}
	fmt.Printf("File uploaded to: %s\n", result.Location)
}
```
By streaming the file through the S3 Uploader, memory usage is bounded by roughly PartSize × Concurrency (here 5MB × 2 = 10MB of part buffers) regardless of the file's total size, and nothing extra is written to disk, ensuring efficient handling of large files.