Understanding Streams in Node.js
For most developers with back-end experience, the Stream object is a familiar and natural thing, but for front-end developers it is far less taken for granted. There is even an article on GitHub with more than 9,000 stars dedicated to explaining what a Stream is: stream-handbook (https://link.zhihu.com/?target=https://github.com/substack/stream-handbook). To understand Stream better, this article briefly summarizes the topic based on that handbook.
Stream is a very common and important concept in Unix systems; in technical terms, a stream is an abstraction over input and output devices.
ls | grep *.js
We often come across commands like this when writing shell scripts: | connects two commands, passing the output of the previous command to the next command as its input, so that data is transmitted through the commands like water through a pipe. Each command acts like a processor that does some work on the data, which is why | is called the "pipe symbol".
From a program's point of view, a stream is directional data, which can be divided into three types according to the direction of flow:
From device to program: readable
From program to device: writable
Bidirectional: duplex and transform
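To make the three categories concrete, here are a few streams taken from Node's core modules (a quick sketch; the file paths are placeholders):

const fs = require('fs');
const net = require('net');
const zlib = require('zlib');

const readable = fs.createReadStream('./some-input-file');   // device -> program
const writable = fs.createWriteStream('./some-output-file'); // program -> device
const socket = new net.Socket();                             // duplex: a TCP socket is both readable and writable
const gzip = zlib.createGzip();                              // transform: what is written in comes out compressed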
NodeJS's stream operations are encapsulated in the Stream module, which many core modules in turn depend on. In keeping with the Unix philosophy that everything is a file, most file handling in NodeJS is done with streams:
Ordinary files
Device files (stdin, stdout)
Network files (http, net)
One point that is easily overlooked: all Streams in NodeJS are instances of EventEmitter.
Suppose that while writing a program we suddenly need to read a configuration file, config.json. A quick look at the task tells us that a readable stream is the right tool for the job:
const fs = require('fs');
const FILEPATH = '...';
const rs = fs.createReadStream(FILEPATH);
With the createReadStream() method provided by the fs module, we easily create a readable stream; the content of config.json now flows from the device into the program. We do not use the Stream module directly because fs already depends on it internally and wraps it for us.
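Because rs, like every stream, is an EventEmitter, we could also consume the file without pipe() at all, simply by listening for its events. A minimal sketch (assuming the file really does contain JSON):

let raw = '';
rs.setEncoding('utf8'); // emit strings instead of Buffers
rs.on('data', chunk => {
  raw += chunk; // 'data' fires repeatedly as chunks flow in
});
rs.on('end', () => {
  const config = JSON.parse(raw); // the whole file has arrived
  console.log(config);
});
rs.on('error', err => console.error(err));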
Once we have the data, we need to process it. For example, suppose we need to write it to some path DEST; for that we need a writable stream, so the data can flow from the program back to the device.
const ws = fs.createWriteStream(DEST);
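A writable stream can also be driven by hand instead of by a pipe: write() pushes chunks in, end() says that no more data is coming, and the 'finish' event fires once everything has been flushed. A minimal sketch:

ws.write('first chunk of data\n');  // returns false when the internal buffer is full (back-pressure)
ws.write('second chunk of data\n');
ws.end('last chunk\n');             // an optional final chunk, then close the stream
ws.on('finish', () => console.log('all data flushed'));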
Now we have two streams, that is, two data processors. How do we connect them the way the Unix pipe symbol | does? In NodeJS, the pipe symbol is the pipe() method.
const fs = require('fs');
const FILEPATH = '...';
const DEST = '...';
const rs = fs.createReadStream(FILEPATH);
const ws = fs.createWriteStream(DEST);
rs.pipe(ws);
In this way we have used streams to implement a simple file-copy feature. How the pipe() method works internally will be covered later, but one thing is worth noting: the data must be piped from upstream to downstream, that is, from a readable stream to a writable stream.
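One caveat: pipe() does not forward errors from the source to the destination, so each stream's 'error' event should be handled in real code. Since Node 10 the same copy can also be written with stream.pipeline(), which wires up error handling and cleanup for us; a minimal sketch, reusing the rs and ws from above:

const { pipeline } = require('stream');

pipeline(rs, ws, err => {
  if (err) {
    console.error('copy failed:', err);
  } else {
    console.log('copy finished');
  }
});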
Calling the readable and writable streams above "processors" is actually not quite accurate, because we are not processing anything yet; we are only reading data and then storing it.
Suppose we have a new requirement: convert all the letters in the local package.json file to lowercase and save the result as package-lower.json in the same directory.
This time we need a bidirectional stream. Assume we have a stream called lower that specializes in converting characters to lowercase; the code would then look roughly like this:
const fs = require('fs');
const rs = fs.createReadStream('./package.json');
const ws = fs.createWriteStream('./package-lower.json');
// lower is the lower-casing stream we assume already exists
rs.pipe(lower).pipe(ws);
Now we can see why a stream connected with pipe() is called a processor. Recall the rule above: a pipe must go from a readable stream to a writable stream. lower receives data from rs, so it must be writable, and it hands data on to ws, so it must also be readable. A little reasoning shows that the lower stream we need has to be bidirectional. The specific use of duplex and transform streams is covered later.
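As a preview, though, here is a minimal sketch of what the hypothetical lower stream could look like, built on stream.Transform, the class Node provides for exactly this kind of read-modify-pass-on processor:

const { Transform } = require('stream');

const lower = new Transform({
  transform(chunk, encoding, callback) {
    // take a chunk from upstream, push the lower-cased version downstream
    callback(null, chunk.toString().toLowerCase());
  }
});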
Of course, there may be additional processing steps, for example converting the letters into ASCII codes. Assuming we have an ascii stream for that, our code might become:
rs.pipe(lower).pipe(ascii).pipe(ws);
Similarly, ascii must also be a bidirectional stream. The processing logic is very clear, but apart from clear code, what else do we gain by using streams?
Consider another scenario: a user wants to watch a video online. Suppose we return the movie content to the user in response to an HTTP request; the code might be written like this:
const http = require('http');
const fs = require('fs');

http.createServer((req, res) => {
  fs.readFile(moviePath, (err, data) => {
    res.end(data);
  });
}).listen(8080);
Code like this has two obvious problems:
The movie file has to be read in full before anything can be returned to the client, so the wait is extremely long.
The whole movie file has to be loaded into memory at once; with many similar requests, memory cannot cope.
With streams, the movie file can be read into memory bit by bit and returned to the client bit by bit (taking advantage of HTTP's Transfer-Encoding: chunked transfer feature). The user experience improves, and the memory overhead drops noticeably:
const http = require('http');
const fs = require('fs');

http.createServer((req, res) => {
  fs.createReadStream(moviePath).pipe(res);
}).listen(8080);
Besides the benefits above, the code is much more elegant and easy to extend. For example, if we need to compress the video content, we can bring in a stream dedicated to that job; it does not need to care what the other parts are doing, as long as it is plugged into the pipeline:
const http = require('http');
const fs = require('fs');
const oppressor = require('oppressor');

http.createServer((req, res) => {
  fs.createReadStream(moviePath)
    .pipe(oppressor(req)) // oppressor negotiates compression based on the request's Accept-Encoding header
    .pipe(res);
}).listen(8080);
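oppressor is a third-party module; if we would rather stay within Node's core, the built-in zlib module offers transform streams that can be dropped into the same spot. A minimal sketch, assuming the client accepts gzip (a real server should check the Accept-Encoding header first); moviePath remains a placeholder as above:

const http = require('http');
const fs = require('fs');
const zlib = require('zlib');

http.createServer((req, res) => {
  res.setHeader('Content-Encoding', 'gzip');
  fs.createReadStream(moviePath)
    .pipe(zlib.createGzip()) // a built-in transform stream
    .pipe(res);
}).listen(8080);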
As you can see, once we use streams, the pieces of our code become relatively independent of each other and maintainability improves. The concrete usage of each kind of stream will be the subject of a follow-up article.