Mastering Node.js Streams and Pipelines
Streams are Node.js's superpower for handling large datasets efficiently: instead of loading a whole file into memory, you process it chunk by chunk. Let's dive into streams and pipelines, starting with the classic manual approach.
const fs = require('fs');

const readStream = fs.createReadStream('big.file');
const writeStream = fs.createWriteStream('output.file');

readStream.on('data', (chunk) => {
  writeStream.write(chunk);
});

readStream.on('end', () => {
  writeStream.end();
});
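One caveat with this manual approach: writeStream.write() returns false once its internal buffer fills up, so a careful version also has to pause the readable side until the writer emits 'drain'. Here's a minimal sketch of that bookkeeping, using the same file names as above and adding error handlers:

const fs = require('fs');

const readStream = fs.createReadStream('big.file');
const writeStream = fs.createWriteStream('output.file');

readStream.on('data', (chunk) => {
  // write() returns false when the writer's buffer is full
  if (!writeStream.write(chunk)) {
    readStream.pause();
    writeStream.once('drain', () => readStream.resume());
  }
});

readStream.on('end', () => writeStream.end());
readStream.on('error', (err) => console.error('read failed:', err));
writeStream.on('error', (err) => console.error('write failed:', err));

readStream.pipe(writeStream) handles this buffering for you, but it still won't forward errors between the two streams or clean up when one side fails.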
Pipelines simplify stream composition and error handling.
const { pipeline } = require('stream/promises');
const fs = require('fs');
const zlib = require('zlib');

async function compressFile(input, output) {
  await pipeline(
    fs.createReadStream(input),
    zlib.createGzip(),
    fs.createWriteStream(output)
  );
  console.log('Compression complete');
}

compressFile('big.file', 'big.file.gz').catch(console.error);
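If any stage throws or closes early, pipeline() rejects and destroys every stream in the chain, so you aren't left cleaning up half-open file handles yourself. And because the stream/promises variant returns a promise, it slots straight into async/await code. Custom Transform streams drop into the same chain: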
const { pipeline } = require('stream/promises');
const { Transform } = require('stream');

const upperCaseTransform = new Transform({
  transform(chunk, encoding, callback) {
    this.push(chunk.toString().toUpperCase());
    callback();
  }
});

pipeline(
  process.stdin,
  upperCaseTransform,
  process.stdout
).catch(console.error);
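A transform is just another stage, so it can sit anywhere in a longer chain. As a sketch (the file names here are hypothetical), this uppercases big.file and gzips the result in a single pass:

const { pipeline } = require('stream/promises');
const fs = require('fs');
const zlib = require('zlib');
const { Transform } = require('stream');

const upperCaseTransform = new Transform({
  transform(chunk, encoding, callback) {
    // callback(null, data) is shorthand for this.push(data); callback()
    callback(null, chunk.toString().toUpperCase());
  }
});

pipeline(
  fs.createReadStream('big.file'),
  upperCaseTransform,
  zlib.createGzip(),
  fs.createWriteStream('big.file.upper.gz')
).catch(console.error);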
Streams shine with large datasets or real-time data processing. Master them for scalable Node.js applications.
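For example, here's a sketch of a tiny HTTP server (the port and file name are placeholders) that streams big.file to each client; memory stays flat because chunks flow through at the client's pace instead of being buffered whole:

const http = require('http');
const fs = require('fs');
const { pipeline } = require('stream/promises');

http.createServer(async (req, res) => {
  res.writeHead(200, { 'Content-Type': 'application/octet-stream' });
  try {
    // Backpressure from a slow client automatically throttles the file read
    await pipeline(fs.createReadStream('big.file'), res);
  } catch (err) {
    // e.g. the client disconnected mid-transfer; pipeline tears down the file stream
    console.error('Stream failed:', err.message);
  }
}).listen(3000);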
Cheers!