How to Efficiently Download and Save Images from URLs in Go?
Problem:
When attempting to retrieve an image from a URL and save it to a file, the compiler reports: "cannot use m (type image.Image) as type []byte in function argument."
Analysis:
The original code decodes the downloaded data into an image.Image value (m), which is an in-memory pixel representation of the image, and then passes m to ioutil.WriteFile(). That function expects raw bytes ([]byte), not a decoded image, and this type mismatch is exactly what the compiler is reporting.
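If the decoded image.Image is actually needed (for example, to resize or inspect pixels), the error can also be fixed by re-encoding the pixels into bytes before writing them out. The sketch below is only an illustration of that path: it assumes the source is a JPEG and uses jpeg.Encode into a bytes.Buffer together with os.WriteFile (the modern replacement for ioutil.WriteFile); the URL and output path are simply the ones from the question. Note that re-encoding recompresses the image, so the saved bytes will differ from the original download.

package main

import (
    "bytes"
    "image"
    "image/jpeg" // registers the JPEG decoder and provides jpeg.Encode
    "log"
    "net/http"
    "os"
)

func main() {
    url := "http://i.imgur.com/m1UIjW1.jpg"

    response, err := http.Get(url)
    if err != nil {
        log.Fatal(err)
    }
    defer response.Body.Close()

    // Decode into an in-memory image.Image, as the original code did.
    m, _, err := image.Decode(response.Body)
    if err != nil {
        log.Fatal(err)
    }

    // Re-encode the pixels into bytes so they can be written to disk.
    var buf bytes.Buffer
    if err := jpeg.Encode(&buf, m, nil); err != nil {
        log.Fatal(err)
    }
    if err := os.WriteFile("/tmp/asdf.jpg", buf.Bytes(), 0644); err != nil {
        log.Fatal(err)
    }
}

For a plain download, however, the decode/encode round trip is unnecessary, which leads to the simpler solution below.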
Solution:
Instead of converting the image to an in-memory representation, we can directly copy the response body to the output file using the io.Copy function. Here's a modified version of the code:
package main

import (
    "fmt"
    "io"
    "log"
    "net/http"
    "os"
)

func main() {
    url := "http://i.imgur.com/m1UIjW1.jpg"

    response, err := http.Get(url)
    if err != nil {
        log.Fatal(err)
    }
    defer response.Body.Close()

    // Open a file for writing.
    file, err := os.Create("/tmp/asdf.jpg")
    if err != nil {
        log.Fatal(err)
    }
    defer file.Close()

    // Use io.Copy to dump the response body straight into the file.
    // The data is streamed, so this works for huge files as well.
    _, err = io.Copy(file, response.Body)
    if err != nil {
        log.Fatal(err)
    }

    fmt.Println("Success!")
}
Explanation:
io.Copy streams the response body into the file in fixed-size chunks, so the image is never held in memory in its entirety and never decoded. Because no image.Image value is created, there is nothing to convert to []byte, which eliminates the original error, and the approach works for arbitrarily large files.
Additional Notes:
http.Get only returns an error for transport-level failures; a 404 or 500 response still yields err == nil, so it is worth checking response.StatusCode before writing the body to disk, as shown in the sketch below. Also note that this approach never decodes the image: whatever bytes the server sends are written verbatim, so the file extension in the output path is purely a name.
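As a rough illustration of that check, the sketch below wraps the same steps in a hypothetical downloadImage helper (the function name and error messages are illustrative, not part of the original answer) and refuses to write anything unless the server replied 200 OK:

package main

import (
    "fmt"
    "io"
    "net/http"
    "os"
)

// downloadImage fetches url and writes the raw response body to path,
// but only when the server answers with 200 OK.
func downloadImage(url, path string) error {
    response, err := http.Get(url)
    if err != nil {
        return err
    }
    defer response.Body.Close()

    if response.StatusCode != http.StatusOK {
        return fmt.Errorf("unexpected status %q while fetching %s", response.Status, url)
    }

    file, err := os.Create(path)
    if err != nil {
        return err
    }
    defer file.Close()

    // Stream the body straight into the file, exactly as in the solution above.
    _, err = io.Copy(file, response.Body)
    return err
}

func main() {
    if err := downloadImage("http://i.imgur.com/m1UIjW1.jpg", "/tmp/asdf.jpg"); err != nil {
        fmt.Fprintln(os.Stderr, err)
        os.Exit(1)
    }
    fmt.Println("Success!")
}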