Node.js, being asynchronous and event-driven, excels at I/O-bound operations. Leveraging Node.js streams significantly simplifies these tasks by efficiently processing data in smaller chunks. Let's delve into the world of streams and see how they streamline I/O.
Key Concepts:
- Node.js streams, asynchronous and event-driven, optimize I/O by handling data in manageable portions.
- Streams are classified as Readable, Writable, or Duplex (both readable and writable). Readable streams fetch data from a source; writable streams send data to a destination.
- The pipe() function is invaluable, facilitating seamless data transfer between source and destination without manual flow management.
- Methods like Readable.pause(), Readable.resume(), and readable.unpipe() offer granular control over data flow, enhancing stream functionality.
Understanding Streams:
Streams are analogous to Unix pipes, enabling effortless data transfer from source to destination. Essentially, a stream is an EventEmitter with specialized methods. The implemented methods determine whether a stream is Readable, Writable, or Duplex. Readable streams provide data input; writable streams handle data output.
You've likely encountered streams in Node.js already. In an HTTP server, the request is a readable stream, and the response is a writable stream. The fs module provides both readable and writable file stream capabilities.
This article focuses on readable and writable streams; duplex streams are beyond its scope.
Readable Streams:
A readable stream reads data from a source (a file, an in-memory buffer, or another stream). As EventEmitters, readable streams emit various events, which we use to interact with them.
Reading from Streams:
The most common approach is to listen for the data event and attach a callback. When data becomes available, the data event fires and the callback executes.
const fs = require('fs');
const readableStream = fs.createReadStream('file.txt');
let data = '';

readableStream.on('data', (chunk) => {
  data += chunk;
});

readableStream.on('end', () => {
  console.log(data);
});
fs.createReadStream() creates a readable stream. The stream is initially paused; attaching a data event listener switches it to flowing mode, and data chunks are then passed to the callback. How often data events fire is up to the stream implementation: an HTTP request stream might emit a chunk every few KB, while a file stream reads chunks of a fixed size (64 KB by default).
The end event signals the end of data.
Alternatively, repeatedly call read() on the stream instance until all data is read:
const fs = require('fs');
const readableStream = fs.createReadStream('file.txt');
let data = '';
let chunk;

readableStream.on('readable', () => {
  while ((chunk = readableStream.read()) !== null) {
    data += chunk;
  }
});

readableStream.on('end', () => {
  console.log(data);
});
read() retrieves data from the internal buffer and returns null when no data remains. The readable event indicates that data is available to be read.
Setting Encoding:
Data is typically a Buffer object. To receive strings instead, call setEncoding() on the stream:
const fs = require('fs');
const readableStream = fs.createReadStream('file.txt');
let data = '';

readableStream.setEncoding('utf8');

readableStream.on('data', (chunk) => {
  data += chunk;
});

readableStream.on('end', () => {
  console.log(data);
});
This interprets data as UTF-8, passing it as a string to the callback.
Piping:
Piping simplifies data transfer between source and destination:
const fs = require('fs');
const readableStream = fs.createReadStream('file1.txt');
const writableStream = fs.createWriteStream('file2.txt');

readableStream.pipe(writableStream);
pipe() handles the data flow automatically.
Chaining:
Streams can be chained:
const fs = require('fs');
const zlib = require('zlib');

fs.createReadStream('input.txt.gz')
  .pipe(zlib.createGunzip())
  .pipe(fs.createWriteStream('output.txt'));
This decompresses input.txt.gz and writes the result to output.txt.
Additional Readable Stream Methods:
- Readable.pause(): Pauses the stream, stopping data events.
- Readable.resume(): Resumes a paused stream.
- readable.unpipe(): Removes destination streams from the pipe.
Writable Streams:
Writable streams send data to a destination. Like readable streams, they are EventEmitters.
Writing to Streams:
Use write() to send data:
const fs = require('fs');
const writableStream = fs.createWriteStream('file.txt');

writableStream.write('Hello, ');
writableStream.write('streams!');
writableStream.end();
write() returns a boolean indicating whether the internal buffer has room for more data. If it returns false, the stream is temporarily full; wait for the drain event before writing more.
End of Data:
Call end() to signal the end of data. The finish event is emitted after all data has been flushed. You cannot write after calling end().
Important Writable Stream Events:
- error: Indicates an error occurred while writing or piping.
- pipe: Emitted when a readable stream is piped into this stream.
- unpipe: Emitted when unpipe() is called on the readable stream.
Conclusion:
Streams are a powerful feature in Node.js, enhancing I/O efficiency. Understanding streams, piping, and chaining enables writing clean, performant code.
Node.js Streams FAQ:
- What are Node.js streams? They are objects that allow for efficient, incremental processing of data, avoiding loading entire datasets into memory.
- Main types of Node.js streams? Readable, Writable, Duplex, and Transform.
- Creating a Readable stream? Extend stream.Readable and implement the _read method.
- Common use cases for Readable streams? Reading large files, processing data from HTTP requests, real-time data handling.
- Creating a Writable stream? Extend stream.Writable and implement the _write method.
- Common uses of Writable streams? Saving data to files, sending data to services.
- Duplex stream? Combines Readable and Writable functionality.
- Transform streams? Modify data as it passes through (e.g., compression, encryption).
- Piping data between streams? Use the .pipe() method.
- Best practices for working with Node.js streams? Use them for large datasets, handle errors and backpressure, and consider util.promisify for promise-based operations.
