Stream File Upload to AWS S3 Using Go
Challenge:
Stream a large multipart/form-data file directly to AWS S3, minimizing memory and disk footprint.
Solution:
To accomplish this, we'll use the s3manager.Uploader from the github.com/aws/aws-sdk-go SDK. Its Upload method accepts any io.Reader as the body and performs an S3 multipart upload behind the scenes, so data can be streamed to S3 without first being written to disk or loaded fully into memory.
Implementation:
The uploader reads the body in chunks of PartSize bytes and uploads up to Concurrency parts at a time, so only a few part-sized buffers are ever held in memory, no matter how large the file is.
Example Code:
package main

import (
	"fmt"
	"os"

	"github.com/aws/aws-sdk-go/aws"
	"github.com/aws/aws-sdk-go/aws/session"
	"github.com/aws/aws-sdk-go/service/s3/s3manager"
)

func main() {
	// Create an S3 uploader with custom options
	uploader := s3manager.NewUploader(session.Must(session.NewSession()), func(u *s3manager.Uploader) {
		u.PartSize = 5 * 1024 * 1024 // 5MB part size (the minimum S3 allows)
		u.Concurrency = 2            // 2 concurrent part uploads
	})

	// Open the file for upload
	f, err := os.Open("file.txt")
	if err != nil {
		panic(err)
	}
	defer f.Close()

	// Stream the file to S3; the uploader reads it part by part,
	// so the whole file is never held in memory at once
	result, err := uploader.Upload(&s3manager.UploadInput{
		Bucket: aws.String("my-bucket"),
		Key:    aws.String("file.txt"),
		Body:   f,
	})
	if err != nil {
		panic(err)
	}

	// Display the uploaded file's location
	fmt.Printf("File uploaded to: %s\n", result.Location)
}
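The program above uploads a local file. Since the challenge is to stream a multipart/form-data upload straight through to S3, here is a minimal sketch of an HTTP handler that feeds each file part of the incoming request directly to the same uploader. The /upload route, the bucket name, and the error handling are illustrative assumptions, not part of the original example.

package main

import (
	"fmt"
	"io"
	"log"
	"net/http"

	"github.com/aws/aws-sdk-go/aws"
	"github.com/aws/aws-sdk-go/aws/session"
	"github.com/aws/aws-sdk-go/service/s3/s3manager"
)

func main() {
	uploader := s3manager.NewUploader(session.Must(session.NewSession()), func(u *s3manager.Uploader) {
		u.PartSize = 5 * 1024 * 1024 // 5MB parts
		u.Concurrency = 2
	})

	http.HandleFunc("/upload", func(w http.ResponseWriter, r *http.Request) {
		// MultipartReader streams the request body part by part,
		// unlike ParseMultipartForm, which buffers it to memory/disk.
		reader, err := r.MultipartReader()
		if err != nil {
			http.Error(w, err.Error(), http.StatusBadRequest)
			return
		}

		for {
			part, err := reader.NextPart()
			if err == io.EOF {
				break // no more parts
			}
			if err != nil {
				http.Error(w, err.Error(), http.StatusBadRequest)
				return
			}
			if part.FileName() == "" {
				continue // skip non-file form fields
			}

			// The part is an io.Reader, so the uploader streams it
			// to S3 directly; nothing is written to local disk.
			result, err := uploader.Upload(&s3manager.UploadInput{
				Bucket: aws.String("my-bucket"), // assumed bucket name
				Key:    aws.String(part.FileName()),
				Body:   part,
			})
			if err != nil {
				http.Error(w, err.Error(), http.StatusInternalServerError)
				return
			}
			fmt.Fprintf(w, "Uploaded to: %s\n", result.Location)
		}
	})

	log.Fatal(http.ListenAndServe(":8080", nil))
}

Because a *multipart.Part is a plain io.Reader rather than an io.ReadSeeker, the uploader copies it into part-sized buffers as it reads; peak memory therefore stays around PartSize * Concurrency (roughly 10MB with the settings above), regardless of how large the uploaded file is.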