How can I efficiently process large JSON arrays without loading them entirely into memory in Go?
When dealing with massive JSON arrays stored in files, it's important to avoid loading the entire array into memory at once, since unmarshaling the whole document can cause out-of-memory errors. Instead, stream the JSON data and decode it one element at a time.
One approach, shown in the json.Decoder example from the encoding/json documentation, looks like this:
package main

import (
	"encoding/json"
	"fmt"
	"log"
	"strings"
)

func main() {
	const jsonStream = `
	[
		{"Name": "Ed", "Text": "Knock knock."},
		{"Name": "Sam", "Text": "Who's there?"},
		{"Name": "Ed", "Text": "Go fmt."},
		{"Name": "Sam", "Text": "Go fmt who?"},
		{"Name": "Ed", "Text": "Go fmt yourself!"}
	]`

	type Message struct {
		Name, Text string
	}

	dec := json.NewDecoder(strings.NewReader(jsonStream))

	// read open bracket
	t, err := dec.Token()
	if err != nil {
		log.Fatal(err)
	}
	fmt.Printf("%T: %v\n", t, t)

	// while the array contains values
	for dec.More() {
		var m Message
		// decode an array value (Message)
		err := dec.Decode(&m)
		if err != nil {
			log.Fatal(err)
		}
		fmt.Printf("%v: %v\n", m.Name, m.Text)
	}

	// read closing bracket
	t, err = dec.Token()
	if err != nil {
		log.Fatal(err)
	}
	fmt.Printf("%T: %v\n", t, t)
}
In this sample code:

- json.NewDecoder wraps an io.Reader, so the decoder consumes input incrementally rather than reading it all up front.
- The first dec.Token() call reads the opening [ delimiter of the array.
- dec.More() reports whether the array contains another element, and dec.Decode(&m) unmarshals exactly one element into a Message per iteration.
- The final dec.Token() call reads the closing ] delimiter.
This approach lets you stream a large JSON array without materializing the entire data structure: because dec.Decode consumes one element at a time, memory usage stays proportional to the size of a single element rather than the whole array.
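To apply the same pattern to data on disk, pass the decoder a file instead of a strings.Reader. The following is a minimal sketch of that idea; the file name messages.json and the count bookkeeping are hypothetical illustrations, not part of the original example.

package main

import (
	"encoding/json"
	"fmt"
	"log"
	"os"
)

type Message struct {
	Name, Text string
}

func main() {
	// Hypothetical input file containing a large JSON array of objects
	// shaped like Message.
	f, err := os.Open("messages.json")
	if err != nil {
		log.Fatal(err)
	}
	defer f.Close()

	// json.Decoder reads from the file in chunks as needed, so the whole
	// array is never held in memory at once.
	dec := json.NewDecoder(f)

	// Consume the opening bracket of the array.
	if _, err := dec.Token(); err != nil {
		log.Fatal(err)
	}

	count := 0
	for dec.More() {
		var m Message
		if err := dec.Decode(&m); err != nil {
			log.Fatal(err)
		}
		// Process each element here; only one Message is decoded at a time.
		count++
	}

	// Consume the closing bracket.
	if _, err := dec.Token(); err != nil {
		log.Fatal(err)
	}

	fmt.Println("processed", count, "messages")
}

Note that json.Decoder buffers its own reads from the underlying io.Reader, so no extra buffering layer is required for this to stay efficient on large files.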