Efficiently Downloading Large Files Using Go
Buffering a large file entirely in memory can exhaust a machine's resources. In Go, how can we download such a file straight to disk, without ever holding the whole thing in memory?
Answer:
Assuming the download is via HTTP:
import ( "net/http" "io" "os" ) func DownloadFile(url, filepath string) (int64, error) { // Open the destination file out, err := os.Create(filepath) if err != nil { return 0, err } defer out.Close() // Start the HTTP request resp, err := http.Get(url) if err != nil { return 0, err } defer resp.Body.Close() // Stream the response body into the file (avoiding full buffering) n, err := io.Copy(out, resp.Body) if err != nil { return 0, err } return n, nil }
This works because resp.Body implements io.Reader. io.Copy reads from that Reader and writes to the file in fixed-size chunks (it uses a 32 KB buffer internally), so memory usage stays constant no matter how large the file is.
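For reference, here is a minimal usage sketch, assuming the DownloadFile function above lives in the same package; the URL and destination filename are placeholders for illustration only:

package main

import (
	"fmt"
	"log"
)

func main() {
	// Placeholder URL and path; substitute your own values.
	n, err := DownloadFile("https://example.com/large-file.iso", "large-file.iso")
	if err != nil {
		log.Fatal(err)
	}
	fmt.Printf("downloaded %d bytes\n", n)
}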