Efficiently Reading JSON Files into Server Memory with JavaScript/Node
When experimenting with Node.js, you often need quick access to a JSON object held in server memory. To get one there, you can read it from a plain JSON text file or from a .js module.
For synchronous file reading, use the following code:
<code class="js">const fs = require('fs');

const obj = JSON.parse(fs.readFileSync('file', 'utf8'));</code>
This call blocks until the whole file has been read and parsed, so the JSON object is available in memory immediately afterwards.
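If the file might be missing or contain invalid JSON, it is worth wrapping the synchronous call in a try/catch. Here is a minimal sketch, assuming a hypothetical config.json in the working directory:
<code class="js">const fs = require('fs');

let config;
try {
  // readFileSync blocks until the whole file is in memory
  config = JSON.parse(fs.readFileSync('config.json', 'utf8'));
} catch (err) {
  // Covers both a missing file and malformed JSON
  console.error('Could not load config.json:', err.message);
  process.exit(1);
}

console.log(config);</code>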
Alternatively, for asynchronous file reading, consider the following:
<code class="js">const fs = require('fs');

let obj;
fs.readFile('file', 'utf8', (err, data) => {
  if (err) throw err;
  obj = JSON.parse(data);
  // obj is only guaranteed to be set here, inside the callback
});</code>
The asynchronous approach relies on an event-driven callback: the file is read without blocking the event loop, and the parsed object only becomes available once the callback has run, so any code that uses it must live inside (or be triggered from) that callback.
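On recent Node versions the same asynchronous read can be written with the promise-based fs API and async/await, which avoids the callback juggling. A minimal sketch, again using a hypothetical data.json:
<code class="js">const fs = require('fs').promises;

async function loadJson(path) {
  // Read the file asynchronously, then parse once the data has arrived
  const data = await fs.readFile(path, 'utf8');
  return JSON.parse(data);
}

loadJson('data.json')
  .then(obj => console.log(obj))
  .catch(err => console.error('Failed to load JSON:', err));</code>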
The choice between plain .json text files and .js files for storing the data is largely a matter of taste. A .json file is lightweight and portable, and can itself be loaded with Node's require(), which parses it synchronously and caches the result; a .js file must export the object via module.exports but can also contain logic. Both have their advantages depending on your requirements, as the sketch below illustrates.
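For comparison, here is what the two storage options can look like in practice; both file names below are hypothetical:
<code class="js">// Option 1: a plain JSON file, parsed synchronously and cached by require()
const settings = require('./settings.json');

// Option 2: a .js module that exports the same object
// settings.js:  module.exports = { port: 8080, debug: false };
const settingsFromModule = require('./settings');

console.log(settings, settingsFromModule);</code>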