Web Streams Everywhere (and Fetch for Node.js)
Jake Archibald's 2016 prediction of "the year of web streams" might have been slightly ahead of its time. However, the Streams Standard, initially proposed in 2014, is now a reality: consistently implemented across modern browsers (with Firefox catching up), in Node.js, and in Deno.
Streaming efficiently handles large data resources by breaking them into smaller "chunks" and processing them sequentially. This avoids waiting for a complete download before processing begins, enabling progressive data handling.
Three main stream types exist: readable, writable, and transform streams. Readable streams provide data chunks (from sources like files or HTTP connections). Transform streams (optional) modify these chunks. Finally, writable streams receive the processed data.
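As a concrete (if contrived) illustration, here's a minimal sketch of that pipeline with a made-up uppercasing transform. It assumes a module context for the top-level await and a runtime where the stream classes are global (in Node.js they can be imported from node:stream/web):

```js
// Source: provides chunks of data
const readable = new ReadableStream({
  start(controller) {
    controller.enqueue('hello');
    controller.close();
  }
});

// Transform: modifies each chunk as it passes through
const upperCase = new TransformStream({
  transform(chunk, controller) {
    controller.enqueue(chunk.toUpperCase());
  }
});

// Sink: receives the processed data
const writable = new WritableStream({
  write(chunk) {
    console.log(chunk); // logs "HELLO"
  }
});

await readable.pipeThrough(upperCase).pipeTo(writable);
```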
Node.js initially had its own stream implementation, often considered complex. The WHATWG web standard for streams offers a significant improvement, now referred to as "web streams" in Node.js documentation. While the original Node.js streams remain, the web standard API coexists, promoting cross-platform code and simplifying development.
Deno, also created by Node.js's original author, fully supports web streams, mirroring browser APIs. Cloudflare Workers and Deno Deploy also leverage this standardized approach.
fetch() and Readable Streams

The most common way to create a readable stream is with fetch(). The response.body of a fetch() call is a readable stream:
```js
fetch('data.txt')
  .then(response => console.log(response.body));
```
The console log reveals several useful stream methods. As the specification states, a readable stream can be piped directly to a writable stream using pipeTo(), or piped through one or more transform streams first using pipeThrough().
Node.js core lacks built-in fetch support, and node-fetch (a popular library) returns a Node stream, not a WHATWG stream. Undici, a newer HTTP/1.1 client from the Node.js team, offers a modern alternative to http.request and provides a fetch implementation where response.body does return a web stream. Undici is likely to become the recommended HTTP request handler in Node.js. Once installed (npm install undici), it functions much like browser fetch. The following example pipes a stream through a transform stream:
```js
import { fetch } from 'undici';
import { TextDecoderStream } from 'node:stream/web';

async function fetchStream() {
  const response = await fetch('https://example.com');
  const stream = response.body;
  const textStream = stream.pipeThrough(new TextDecoderStream());
  // ... further processing of textStream ...
}
```
response.body is synchronous, so await isn't needed. Browser code is almost identical, omitting the import statements since fetch and TextDecoderStream are globally available. Deno also supports them natively.
The for-await-of loop provides asynchronous iteration, extending the for-of loop's functionality to asynchronous iterables (like streams and arrays of promises).
```js
async function fetchAndLogStream() {
  const response = await fetch('https://example.com');
  const stream = response.body;
  const textStream = stream.pipeThrough(new TextDecoderStream());

  for await (const chunk of textStream) {
    console.log(chunk);
  }
}

fetchAndLogStream();
```
This works in Node.js, Deno, and modern browsers (though asynchronous iteration over streams isn't yet supported in all browsers).
While fetch() is the most common approach, other methods create readable streams too: Blob.stream() and File.stream() (in Node.js, these require import { Blob } from 'buffer';). In browsers, an <input type="file"> element easily provides a file stream:
```js
const fileStream = document.querySelector('input').files[0].stream();
```
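On the Node.js side, a minimal sketch of Blob.stream() might look like this (the blob's contents here are made up for illustration):

```js
import { Blob } from 'buffer';

const blob = new Blob(['Hello, world!'], { type: 'text/plain' });
const blobStream = blob.stream(); // a readable web stream over the blob's bytes
```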
Node.js 17 introduces FileHandle.readableWebStream(), available on the file handle returned by open() from node:fs/promises:
```js
import { open } from 'node:fs/promises';
// ... (open file and process stream) ...
```
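The example above is elided; a possible completion, assuming a hypothetical local file data.txt, might look like this:

```js
import { open } from 'node:fs/promises';
import { TextDecoderStream } from 'node:stream/web';

async function logFileStream() {
  const file = await open('./data.txt'); // hypothetical file
  const webStream = file.readableWebStream(); // a readable web stream of the file's bytes

  for await (const chunk of webStream.pipeThrough(new TextDecoderStream())) {
    console.log(chunk);
  }

  await file.close();
}

logFileStream();
```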
pipeTo() returns a promise, so it's easy to run code once all the data has been written:
```js
someReadableStream
  .pipeTo(someWritableStream)
  .then(() => console.log("Data written"))
  .catch(error => console.error("Error", error));
```
Or using await:

```js
await someReadableStream.pipeTo(someWritableStream);
```
Beyond TextDecoderStream (and its counterpart TextEncoderStream), you can create custom transform streams using TransformStream. The constructor accepts an object with optional start, transform, and flush methods; transform performs the data transformation.
Here's a simplified text decoder (for illustrative purposes only; use TextDecoderStream in production):
```js
const decoder = new TextDecoder();
const decodeStream = new TransformStream({
  transform(chunk, controller) {
    controller.enqueue(decoder.decode(chunk, { stream: true }));
  }
});
```
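The flush method deserves a quick illustration: it runs once after the source is exhausted, which makes it the right place to emit any buffered leftovers. Here's a sketch of a hypothetical line-splitting transform (assuming string chunks, e.g., downstream of TextDecoderStream):

```js
let buffer = '';

const splitLines = new TransformStream({
  transform(chunk, controller) {
    buffer += chunk;
    const lines = buffer.split('\n');
    buffer = lines.pop(); // keep the trailing partial line for later
    for (const line of lines) controller.enqueue(line);
  },
  flush(controller) {
    // runs once the source is done: emit any remaining partial line
    if (buffer) controller.enqueue(buffer);
  }
});
```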
Similarly, you can create custom readable streams using ReadableStream, providing start, pull, and cancel functions. The start function uses controller.enqueue to add chunks, as shown in the example below.
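For instance, here's a minimal sketch of a custom readable stream that enqueues two hardcoded chunks and is then consumed (the top-level for await assumes a module context; see the browser caveat above):

```js
const readableStream = new ReadableStream({
  start(controller) {
    controller.enqueue('Hello, '); // make a chunk available to consumers
    controller.enqueue('world!');
    controller.close(); // signal that no more chunks are coming
  }
});

for await (const chunk of readableStream) {
  console.log(chunk);
}
```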
Node.js 17 also provides .fromWeb() and .toWeb() static methods (on the stream classes in node:stream, such as Readable and Writable) for converting between Node streams and web streams.
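As a sketch (assuming a hypothetical data.txt), converting a classic Node stream into a web stream might look like this:

```js
import { createReadStream } from 'node:fs';
import { Readable } from 'node:stream';

// Wrap a classic Node.js readable stream in a web ReadableStream
const nodeStream = createReadStream('./data.txt'); // hypothetical file
const webStream = Readable.toWeb(nodeStream);
```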
The convergence of browser and Node.js APIs continues, with streams being a key part of this unification. While full front-end stream adoption is still underway (e.g., MediaStream isn't a readable stream yet), the future points toward broader stream utilization. The potential for efficient I/O and cross-platform development makes learning web streams worthwhile.