


Buffer module
JavaScript was originally designed for browsers, so it handles Unicode-encoded strings well but not binary data. This is a problem for Node.js, because Node.js is designed to send and receive data over the network, and that data is often in binary format. For example:
- Send and receive data via TCP connection;
- Read binary data from images or compressed files;
- Read and write data from the file system;
- Process binary data streams from the network
The Buffer module gives Node.js a way to store raw data, so binary data can be used in a JavaScript context. Whenever you need to handle data that is moved around during I/O operations in Node.js, you can use the Buffer module.
Class: Buffer
The Buffer class is a global type used to work with binary data directly. It can be constructed in a variety of ways.
Raw data is stored in instances of the Buffer class. A Buffer instance is similar to an array of integers.
1. new Buffer(size): Allocates a new buffer of size bytes (each element is an 8-bit byte).
2. new Buffer(array): Allocates a new buffer using an array of 8-bit bytes.
3. new Buffer(str, [encoding]): Allocates a new buffer containing the given string. encoding {String} the encoding to use; this parameter is optional.
4. Class method: Buffer.isEncoding(encoding): Returns true if the given encoding is valid, otherwise false.
5. Class method: Buffer.isBuffer(obj): Tests whether obj is a Buffer. Returns a Boolean.
6. Class method: Buffer.concat(list, [totalLength]): list {Array} an array of Buffers to be concatenated. totalLength {Number} the total size of all Buffers in the array.
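As a quick illustration of the class methods above, here is a small sketch (newer Node.js versions prefer Buffer.from()/Buffer.alloc(), but the legacy constructor used throughout this article still works):
var a = new Buffer('foo', 'utf8');
var b = new Buffer('bar', 'utf8');
console.log(Buffer.isEncoding('utf8')); // true
console.log(Buffer.isEncoding('utf-9')); // false, not a real encoding
console.log(Buffer.isBuffer(a)); // true
console.log(Buffer.isBuffer('foo')); // false, a plain string
// totalLength (6) is optional; passing it saves Buffer.concat an extra pass over the list
var joined = Buffer.concat([a, b], 6);
console.log(joined.toString('utf8')); // foobar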
In addition to obtaining a Buffer instance by reading a file, a Buffer can also be constructed directly, for example:
var bin = new Buffer([ 0x48, 0x65, 0x6c, 0x6c, 0x6f ]);
A Buffer is similar to a string. In addition to using the .length property to get the byte length, you can also use [index] to read the byte at a specified position, for example:
bin[0]; // => 0x48;
Buffers and strings can be converted to each other. For example, binary data can be converted into strings using the specified encoding:
var str = bin.toString('utf-8'); // => "Hello"
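Going the other direction, a string can be turned into a Buffer with the constructor from item 3 above; a small sketch:
var buf = new Buffer('Hello', 'utf-8');
console.log(buf); // <Buffer 48 65 6c 6c 6f>
console.log(buf.toString('utf-8')); // Hello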
The .slice method does not copy the data into a new Buffer; the returned Buffer behaves more like a pointer into the middle of the original Buffer's memory, as shown below.
[ 0x48, 0x65, 0x6c, 0x6c, 0x6f ]
  ^           ^
  |           |
  bin         bin.slice(2)
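A small sketch to make the shared memory visible: writing through the slice changes the original Buffer as well.
var bin = new Buffer([ 0x48, 0x65, 0x6c, 0x6c, 0x6f ]);
var sub = bin.slice(2); // refers to the 'llo' part of the same memory
sub[0] = 0x4c; // overwrite the first byte of the slice with 'L'
console.log(bin.toString()); // HeLlo -- the original buffer sees the change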
Write buffer
var buffer = new Buffer(8); // Create a buffer with 8 bytes of memory allocated
console.log(buffer.write('a', 'utf8')); // Outputs 1
This writes the character "a" into the buffer, and Node returns the number of bytes written to the buffer after encoding. The UTF-8 encoding of the letter "a" occupies 1 byte.
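The return value counts bytes, not characters, so a multi-byte character makes the difference visible; a small sketch:
var buffer = new Buffer(8);
console.log(buffer.write('a', 'utf8')); // 1 -- 'a' takes one byte in UTF-8
console.log(buffer.write('€', 1, 'utf8')); // 3 -- the euro sign takes three bytes, written starting at offset 1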
Copy buffer
Node.js provides a method to copy the entire contents of one Buffer object to another. Copying only works between existing Buffer objects, so both the source and the target must be created first.
buffer.copy(bufferToCopyTo)
Here, bufferToCopyTo is the target Buffer object to copy into. Example below:
var buffer1 = new Buffer(14); // 'nice to meet u' is 14 bytes in UTF-8
buffer1.write('nice to meet u', 'utf8');
var buffer2 = new Buffer(14);
buffer1.copy(buffer2);
console.log(buffer2.toString()); // nice to meet u
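copy() also takes offsets, so only part of a buffer can be copied. A sketch of the full form buffer.copy(target, targetStart, sourceStart, sourceEnd):
var source = new Buffer('nice to meet u', 'utf8');
var target = new Buffer('..............', 'utf8'); // 14 dots as a visible background
source.copy(target, 0, 8, 14); // copy bytes 8..13 ('meet u') to the start of target
console.log(target.toString()); // meet u........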
Stream module
In UNIX-like operating systems, streams are a standard concept. There are three main streams:
1. Standard input
2. Standard output
3. Standard error
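Node.js exposes all three as streams on the process object, so the stream interface introduced below applies to them as well; a minimal sketch:
// stdin is a readable stream; stdout and stderr are writable streams
process.stdin.setEncoding('utf8');
process.stdin.on('data', function (chunk) {
  process.stdout.write('echo: ' + chunk);
});
process.stderr.write('waiting for input...\n');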
Readable stream
If buffers are how Node.js handles raw data, then streams are usually how Node.js moves data. Streams in Node.js are either readable or writable. Many modules in Node.js use streams, including HTTP and the file system.
Suppose we create a classmates.txt file and read a list of names from it in order to use this data. Because the data arrives as a stream, you can start acting on the first few bytes before the whole file has been read. This is a common pattern in Node.js:
var fs = require('fs');
var stream = fs.createReadStream('classmates.txt');
stream.setEncoding('utf8');
stream.on('data', function (chunk) {
  console.log('read some data');
});
stream.on('close', function () {
  console.log('all the data is read');
});
In the above example, the data event is triggered whenever new data is received, and the close event is triggered when the file has been read completely.
Writable stream
Naturally, we can also create writable streams to write data to. This means that with a simple script you can use one stream to read from a file and another to write to a second file:
var fs = require('fs');
var readableStream = fs.createReadStream('classmates.txt');
var writableStream = fs.createWriteStream('names.txt');
readableStream.setEncoding('utf8');
readableStream.on('data', function (chunk) {
  writableStream.write(chunk);
});
readableStream.on('close', function () {
  writableStream.end();
});
Each time a data event is received, the chunk is written to the writable stream.
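The same read-then-write wiring can be expressed more compactly with readable.pipe(writable), which handles the data and end events internally; a sketch using the same two file names:
var fs = require('fs');
fs.createReadStream('classmates.txt').pipe(fs.createWriteStream('names.txt'));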
readable.setEncoding(encoding): Returns: this. Sets the character encoding for data read from the stream.
readable.resume(): Returns: this. Causes the readable stream to resume emitting data events.
readable.pause(): Returns: this. Causes a stream in flowing mode to stop emitting data events, switching out of flowing mode; any data that becomes available stays in the internal buffer.
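A small sketch of pause() and resume(), throttling the reader for one second after each chunk:
var fs = require('fs');
var stream = fs.createReadStream('classmates.txt');
stream.setEncoding('utf8');
stream.on('data', function (chunk) {
  stream.pause(); // stop data events while this chunk is being "processed"
  console.log('got %d characters, pausing for 1s', chunk.length);
  setTimeout(function () {
    stream.resume(); // continue firing data events
  }, 1000);
});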
Class: stream.Writable
The Writable stream interface is an abstraction for a destination that you write data to.
1.writable.write(chunk, [encoding], [callback]):
chunk {String | Buffer} Data to be written
encoding {String} encoding, if chunk is a string
callback {Function} callback after data block is written
Returns: {Boolean} true if the data has been fully processed.
This method writes data to the underlying system and calls the given callback after the data is processed.
2. writable.cork(): Forces all subsequent writes to be buffered in memory.
The buffered data is flushed and written out when .uncork() or .end() is called (see the sketch after this list).
3.writable.end([chunk], [encoding], [callback])
chunk {String | Buffer} optional, data to be written
encoding {String} encoding, if chunk is a string
callback {Function} optional, callback after the stream ends
Calling write() after calling end() will generate an error.
// Write 'hello, ' and end with 'world!'
var http = require('http');
http.createServer(function (req, res) {
  res.write('hello, ');
  res.end('world!');
  // No further writing is allowed now
}).listen(3000); // pick any free port
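A short sketch of cork()/uncork() together with write()'s return value and callback, using a file stream as the target:
var fs = require('fs');
var out = fs.createWriteStream('names.txt');
out.cork(); // buffer the following writes in memory
var ok = out.write('hello, ', 'utf8', function () {
  console.log('first chunk flushed');
});
console.log(ok); // true while the internal buffer still has room
out.write('world!', 'utf8');
out.uncork(); // flush everything buffered since cork()
out.end(); // no further writes are allowed after this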
