
An in-depth analysis of file flow in node.js

青灯夜游 (forwarded) · 2021-11-19 19:19:33 · 2255 views

This article analyzes file streams in Node.js. I hope it is helpful to everyone!


File stream

Because the various media in a computer read and store data at different speeds and have different capacities, moving data directly between them can leave one side waiting for a long time. Streams address this by transferring data in small chunks.

There are three main types of file streams: readable streams (Readable), writable streams (Writable), and duplex streams (Duplex). There is also a fourth, less commonly used type: the transform stream (Transform).

Node provides the stream module, which exposes the Readable and Writable classes. File streams inherit from these classes, so they share many common methods.

Readable stream (Readable)

A readable stream moves data from a source into memory; for example, data on disk is transferred into memory.

createReadStream

fs.createReadStream(path, configuration)

The configuration options include: encoding (the character encoding), start (the byte at which to start reading), end (the byte at which to stop reading), and highWaterMark (the amount of data read per chunk).

highWaterMark: the maximum number of bytes read into the internal buffer per chunk (64 KB by default for file streams); it counts bytes even when an encoding is set.

It returns a ReadStream, a subclass of Readable.

const readable = fs.createReadStream(filename, {
    encoding: 'utf-8',
    start: 1,
    end: 2,
    // highWaterMark:
});

Register event

readable.on(event name, handler function)

readable.on('open', (fd) => {
    // fd is the numeric file descriptor
    console.log('file opened');
})

readable.on('error', (err) => {
    console.log(err);
    console.log('an error occurred while reading the file');
})

readable.on('close', () => {
    console.log('file closed');
})

readable.close() // trigger close manually with readable.close(), or let the stream close automatically after reading finishes -- the autoClose option defaults to true

readable.on('data', (data) => {
    console.log(data);
    console.log('file is being read');
})

readable.on('end', () => {
    console.log('file reading finished');
})

Pause reading

readable.pause() pauses reading and emits the pause event.

Resume reading

readable.resume() resumes reading and emits the resume event.

Writable stream

const ws = fs.createWriteStream(filename[, configuration])

ws.write(data)

Writes a piece of data; data can be a string or a Buffer. It returns a Boolean.

If it returns true, the write channel is not yet full, and the next piece of data can be written directly. The channel's size is the highWaterMark value from the configuration.

If it returns false, the write channel is full; the remaining data has to wait, which creates back pressure.

const ws = fs.createWriteStream(filename, {
    encoding: 'utf-8',
    highWaterMark: 2
})

const flag = ws.write('刘');
console.log(flag); // false: '刘' takes three bytes in UTF-8, more than the two-byte channel

Although this call only runs once, the stream keeps writing the buffered data out as space frees up in the channel; no further value is returned.

ws.write() returns its value only once, at the moment of the call.


const flag = ws.write('a');
console.log(flag);
const flag1 = ws.write('a');
console.log(flag1);
const flag2 = ws.write('a');
console.log(flag2);
const flag3 = ws.write('a');
console.log(flag3);

Output order: true, false, false, false

By the second write, two bytes are already occupied; the third write fills the channel completely, so those calls return false.

Use streams to copy and paste files and solve back pressure problems

const fs = require('fs');
const path = require('path');

const filename = path.resolve(__dirname, './file/write.txt');
const wsfilename = path.resolve(__dirname, './file/writecopy.txt');

const ws = fs.createWriteStream(wsfilename);
const rs = fs.createReadStream(filename);

rs.on('data', chunk => {
    const flag = ws.write(chunk);
    if (!flag) {
        rs.pause(); // the write channel is full: stop reading
    }
})

ws.on('drain', () => {
    rs.resume(); // the write channel has drained: it is safe to read again
})

rs.on('close', () => {
    ws.end();
    console.log('copy end');
})

pipe

Using pipe, you can connect a readable stream directly to a writable stream; pipe also handles the back pressure problem for you.

rs.pipe(ws);

rs.on('close', () => {
    console.log('copy end'); // by default, pipe() already calls ws.end() when reading finishes
})

After studying this, I feel that file streams are very convenient for reading and writing large files: they are fast and efficient. Compared with writeFile and readFile they are much more efficient, and when handled correctly they cause no major blocking.

For more node-related knowledge, please visit: nodejs tutorial! !


Statement:
This article is reproduced from juejin.cn. If there is any infringement, please contact admin@php.cn to have it deleted.