Streams in Node.js - Tutorial - Part 7
Streams in Node.js are a powerful way to handle I/O operations efficiently, especially when working with large amounts of data. Instead of reading and writing data all at once, streams allow us to process it in chunks, which improves performance and reduces memory consumption.
Node.js provides four types of streams:
| Stream Type | Description | Example |
|---|---|---|
| Readable Streams | Used for reading data | Reading from a file |
| Writable Streams | Used for writing data | Writing to a file |
| Duplex Streams | Both readable and writable | Sockets |
| Transform Streams | A duplex stream where data can be modified as it is read or written | Compression |
Reading a file in chunks with a readable stream:

```javascript
const fs = require('fs');

const readableStream = fs.createReadStream('example.txt', { encoding: 'utf8' });

readableStream.on('data', (chunk) => {
  console.log('Received chunk:', chunk);
});

readableStream.on('end', () => {
  console.log('No more data.');
});

readableStream.on('error', (err) => {
  console.error('Error:', err);
});
```
Writing to a file with a writable stream:

```javascript
const fs = require('fs');

const writableStream = fs.createWriteStream('output.txt');

writableStream.write('Hello, Node.js streams!\n');
writableStream.end(); // Close the stream

writableStream.on('finish', () => {
  console.log('Finished writing.');
});

writableStream.on('error', (err) => {
  console.error('Error:', err);
});
```
Piping connects a readable stream directly to a writable stream, moving the data chunk by chunk:

```javascript
readableStream.pipe(writableStream);
```
Streams help in processing large amounts of data efficiently. For example, when working with files, streams allow you to avoid loading the entire file into memory. This is particularly useful when handling media files, big datasets, or data from HTTP requests.
Final Tips

- Always attach an 'error' handler to every stream; an unhandled stream error can crash the process.
- Prefer pipe (or stream.pipeline) over manual 'data' handling when connecting streams.
- Reach for streams whenever the data is too large to hold comfortably in memory.