Buffer module
JavaScript was originally designed for the browser, so it handles Unicode-encoded strings well but not binary data. This is a problem for Node.js, because Node.js is designed to send and receive data over the network, often in binary format. For example:
- Send and receive data via TCP connection;
- Read binary data from images or compressed files;
- Read and write data from the file system;
- Process binary data streams from the network
The Buffer module gives Node.js a way to store raw data, so binary data can be used in a JavaScript context. Whenever you need to handle data that is being moved around in I/O operations in Node.js, you can use the Buffer module.
Class: Buffer
The Buffer class is a global type for working with binary data directly. It can be constructed in a variety of ways.
Raw data is stored in an instance of the Buffer class; a Buffer instance is similar to an array of integers.
1.new Buffer(size): Allocates a new buffer of size octets (8-bit bytes).
2.new Buffer(array): Allocates a new buffer using an array of octets.
3.new Buffer(str, [encoding]): Allocates a new buffer containing the given string. encoding {String}: the encoding to use; this parameter is optional and defaults to 'utf8'.
4. Class method: Buffer.isEncoding(encoding): Returns true if the given encoding is valid, otherwise false.
5. Class method: Buffer.isBuffer(obj): Tests whether obj is a Buffer. Returns a Boolean.
6. Class method: Buffer.concat(list, [totalLength]): list {Array}: an array of Buffer objects to concatenate. totalLength {Number}: the total length of the Buffers in the list (optional).
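A small sketch of these class methods (the buffer contents are illustrative; expected output is shown in the comments):

var buf1 = new Buffer('hello ');
var buf2 = new Buffer('world');

console.log(Buffer.isEncoding('utf8'));  // true
console.log(Buffer.isEncoding('nope'));  // false
console.log(Buffer.isBuffer(buf1));      // true
console.log(Buffer.isBuffer('hello'));   // false

var joined = Buffer.concat([buf1, buf2], buf1.length + buf2.length);
console.log(joined.toString());          // 'hello world'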
Besides obtaining a Buffer instance by reading a file, you can also construct one directly.
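For example, a Buffer can be allocated by size, built from an array of octets, or created from a string (a minimal sketch; in newer Node.js versions Buffer.alloc() and Buffer.from() are preferred over new Buffer()):

var buf1 = new Buffer(8);                        // 8 bytes of allocated memory
var buf2 = new Buffer([0x48, 0x69]);             // from an array of octets
var buf3 = new Buffer('Hello, world', 'utf8');   // from a string

console.log(buf3);   // <Buffer 48 65 6c 6c 6f 2c 20 77 6f 72 6c 64>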
A Buffer is similar to a string: besides using the .length property to get the byte length, you can also use [index] to read the byte at a given position.
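For example (a sketch; the byte values in the comments are the UTF-8 codes of the characters):

var buf = new Buffer('Hello');

console.log(buf.length);   // 5, the byte length
console.log(buf[0]);       // 72, the byte at position 0 ('H')
console.log(buf[4]);       // 111, the byte at position 4 ('o')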
Write buffer
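Data can be written into an existing Buffer with buf.write(string, [offset], [length], [encoding]), which returns the number of bytes written. A minimal sketch (the buffer size and contents are illustrative):

var buf = new Buffer(16);

var bytesWritten = buf.write('Hello', 0, 5, 'utf8');
console.log(bytesWritten);                            // 5
console.log(buf.toString('utf8', 0, bytesWritten));   // 'Hello'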
Copy buffer
Node.js provides the copy() method to copy the entire contents of one Buffer object into another: buffer.copy(bufferToCopyTo). Copying is only possible between existing Buffer objects, so both must already have been created. Here, bufferToCopyTo is the target Buffer object being copied into.
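A sketch of the copy (the source contents are illustrative):

var buffer = new Buffer('data to be copied');
var bufferToCopyTo = new Buffer(buffer.length);

buffer.copy(bufferToCopyTo);              // copy the entire contents
console.log(bufferToCopyTo.toString());   // 'data to be copied'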
Stream module
In UNIX-type operating systems, streams are a standard concept. There are three main streams as follows:
1.Standard input
2.Standard output
3.Standard error
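In Node.js these correspond to process.stdin, process.stdout and process.stderr, which are themselves streams. A minimal sketch that echoes standard input to standard output:

process.stdin.pipe(process.stdout);                      // echo stdin to stdout
process.stderr.write('this goes to standard error\n');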
Readable stream
If buffers are how Node.js handles raw data, then streams are usually how Node.js moves data. Streams in Node.js are either readable or writable. Many modules in Node.js use streams, including HTTP and the file system.
Suppose we create a classesmates.txt file containing a list of names and read it in order to use this data. Because the data arrives as a stream, you can start acting on the first few bytes before the whole file has been read. This is a common pattern in Node.js:
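(A sketch of that pattern; the handler bodies are illustrative.)

var fs = require('fs');

var stream = fs.createReadStream('classesmates.txt');

stream.on('data', function (chunk) {
  // a new chunk of the file has arrived and can be used immediately
  console.log('read ' + chunk.length + ' bytes');
});

stream.on('close', function () {
  // the whole file has been read
  console.log('finished reading the file');
});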
In the example above, the data event is emitted whenever new data is received, and the close event is emitted once the file has been completely read.
Writable stream
Naturally, we can also create writable streams to write data to. This means that, with a simple script, you can use one stream to read from a file and another to write to a second file:
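(A sketch of such a script; the destination file name is illustrative.)

var fs = require('fs');

var readable = fs.createReadStream('classesmates.txt');
var writable = fs.createWriteStream('names_copy.txt');

readable.on('data', function (chunk) {
  // each chunk read from the source file is written to the destination
  writable.write(chunk);
});

readable.on('close', function () {
  writable.end();   // close the writable stream when reading is done
});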
Whenever a data event is received, that data is written to the writable stream.
Class: stream.Readable
1.readable.setEncoding(encoding): Returns: this. Sets the encoding for data read from the stream; after calling it, data events emit strings in the given encoding instead of Buffer objects.
2.readable.resume(): Returns: this. Causes the readable stream to resume emitting data events.
3.readable.pause(): Returns: this. Causes a stream in flowing mode to stop emitting data events, switch out of flowing mode, and leave subsequently available data in the internal buffer.
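A sketch combining these three methods (the one-second pause is illustrative):

var fs = require('fs');

var readable = fs.createReadStream('classesmates.txt');
readable.setEncoding('utf8');   // data events now emit strings instead of Buffers

readable.on('data', function (chunk) {
  readable.pause();             // stop data events for a moment
  console.log('got a chunk of ' + chunk.length + ' characters');

  setTimeout(function () {
    readable.resume();          // continue emitting data events
  }, 1000);
});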
Class: stream.Writable
The Writable stream interface is an abstraction for a destination that you write data to.
1.writable.write(chunk, [encoding], [callback]):
chunk {String | Buffer} Data to be written
encoding {String} encoding, if chunk is a string
callback {Function} callback after data block is written
Returns: {Boolean} true if the data has been fully processed.
This method writes data to the underlying system and calls the given callback after the data is processed.
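A sketch of write() and its return value (the file name and data are illustrative):

var fs = require('fs');

var writable = fs.createWriteStream('output.txt');

var flushed = writable.write('some data\n', 'utf8', function () {
  console.log('this chunk has been handled');
});

// false means the chunk was buffered because the internal buffer is full;
// it is then best to wait for the 'drain' event before writing more
console.log(flushed);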
2.writable.cork(): Forces all subsequent writes to be buffered.
The buffered data will be written out when .uncork() or .end() is called.
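A sketch of cork() and uncork() (the file name is illustrative):

var fs = require('fs');

var writable = fs.createWriteStream('output.txt');

writable.cork();              // buffer all following writes
writable.write('line 1\n');
writable.write('line 2\n');
writable.uncork();            // flush the buffered writes to the underlying stream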
3.writable.end([chunk], [encoding], [callback])
chunk {String | Buffer} optional, data to be written
encoding {String} encoding, if chunk is a string
callback {Function} optional, callback after the stream ends
Calling write() after calling end() will generate an error.
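A sketch of end() (the file name is illustrative):

var fs = require('fs');

var writable = fs.createWriteStream('output.txt');

writable.write('first chunk\n');
writable.end('last chunk\n', 'utf8', function () {
  console.log('the stream has finished');
});

// writable.write('more data');   // calling write() after end() emits an error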