
How to implement HTTP transfer of large files based on nodejs? (Sharing of practical methods)


How do you implement HTTP transmission of large files based on Node.js? The following article introduces several practical HTTP large-file transfer solutions built on Node.js. I hope it will be helpful to you!


HTTP file transfer based on Node.js plays an important role in today's front-end and back-end full-stack development. In this article, I will walk through several solutions for implementing HTTP transfer of large files. Before implementing them, we first use the Node.js fs module to write a large file and generate it locally in the project:

// Generate a local text file to use as the "large file" material
const fs = require('fs');
const writeStream = fs.createWriteStream(__dirname + "/file.txt");
for(let i = 0; i <= 100000; i++) {
  writeStream.write(`${i} —— 我是${i}号文件\n`, "utf-8");
}
writeStream.end();

[Screenshot: the generated file.txt (about 3.2MB) in the project directory]

After the above code runs successfully, a text file of about 3.2MB is generated in the current working directory; it will serve as the "large file" material for the programs below. Before listing the transfer solutions, we first encapsulate two public methods that will be used later: a file reading method and a file compression method:

// Modules used by the helpers and the server examples below
const http = require('http');
const zlib = require('zlib');

// Encapsulate a promise-based file reading method
const readFile = async (paramsData) => {
  return new Promise((resolve, reject) => {
    fs.readFile(paramsData, (err, data) => {
      if(err) {
        reject('file read error');
      } else {
        resolve(data);
      }
    })
  })
}

// Encapsulate a promise-based file compression method
const gzip = async (paramsData) => {
  return new Promise((resolve, reject) => {
    zlib.gzip(paramsData, (err, result) => {
      if(err) {
        reject('file compression error');
      } else {
        resolve(result);
      }
    })
  })
}
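As a side note, these two helpers can also be obtained directly from Node's built-in promise-based APIs instead of hand-written wrappers. A minimal sketch, assuming Node.js 14 or later (for the fs/promises module):

const fsPromises = require('fs/promises');   // promise-based fs API
const zlib = require('zlib');
const { promisify } = require('util');

// readFile here already returns a Promise<Buffer>
const readFile = (path) => fsPromises.readFile(path);
// promisify turns the callback-style zlib.gzip into a Promise-returning function
const gzip = promisify(zlib.gzip);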

1. Transfer through data compression

When the browser sends a request, it carries the accept and accept-* request headers, which tell the server which file types, compression formats, and languages the current browser supports. The Accept-Encoding field in the request header tells the server which content encodings (usually compression algorithms) the client can understand. The server chooses one of the methods supported by the client and informs the client of its choice via the Content-Encoding response header. The response headers below, for example, tell the browser that the returned JS script has been processed with the gzip compression algorithm:

// Request headers
accept-encoding: gzip, deflate, br
// Response headers
cache-control: max-age=2592000 
content-encoding: gzip 
content-type: application/x-javascript

With the Accept-Encoding and Content-Encoding fields understood, let's compare the effect of serving the file without gzip and with gzip enabled.

// A simple file-reading server (gzip not enabled)
const server = http.createServer(async (req, res) => {
  res.writeHead(200, {
    "Content-Type": "text/plain;charset=utf-8",
  });
  const buffer = await readFile(__dirname + '/file.txt');
  res.write(buffer);
  res.end();
})
server.listen(3000, () => {
  console.log(`server started successfully`)
})

[Screenshot: result of the request without gzip enabled]

// A simple file-reading server (gzip enabled)
const server = http.createServer(async (req, res) => {
  res.writeHead(200, {
    "Content-Type": "text/plain;charset=utf-8",
    "Content-Encoding": "gzip"
  });
  const buffer = await readFile(__dirname + '/file.txt');
  const gzipData = await gzip(buffer);
  res.write(gzipData);
  res.end();
})
server.listen(3000, () => {
  console.log(`server started successfully`)
})

[Screenshot: result of the request with gzip enabled]
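Note that the server above compresses unconditionally. In practice you would usually inspect the client's Accept-Encoding header first and only gzip when the client advertises support for it. The following is a minimal sketch of that negotiation, reusing the readFile and gzip helpers defined earlier (the port and file path are the same assumptions as above):

// Choose the response encoding based on the client's Accept-Encoding header
const server = http.createServer(async (req, res) => {
  const buffer = await readFile(__dirname + '/file.txt');
  const acceptEncoding = req.headers['accept-encoding'] || '';

  if (acceptEncoding.includes('gzip')) {
    // The client understands gzip: compress and label the response accordingly
    res.writeHead(200, {
      "Content-Type": "text/plain;charset=utf-8",
      "Content-Encoding": "gzip",
    });
    res.end(await gzip(buffer));
  } else {
    // The client did not advertise gzip: send the uncompressed body
    res.writeHead(200, { "Content-Type": "text/plain;charset=utf-8" });
    res.end(buffer);
  }
})
server.listen(3000, () => {
  console.log(`server started successfully`)
})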

2. Transfer through data chunking

Chunked transfer can be used in scenarios such as generating a large HTML table from database query results or transmitting a large number of images.

Transfer-Encoding: chunked
Transfer-Encoding: gzip, chunked

When the value of the Transfer-Encoding field in the response header is chunked, the data is sent in a series of chunks. Note that Transfer-Encoding and Content-Length are mutually exclusive: the two fields cannot appear in the same response message.

// Chunked data transfer
const splitChunks = async () => {
  const buffer = await readFile(__dirname + '/file.txt');
  const lines = buffer.toString('utf-8').split('\n');
  let [chunks, i, n] = [[], 0, lines.length];
  while(i < n) {
    chunks.push(lines.slice(i, i += 10));
  };
  return chunks;
}
const server = http.createServer(async (req, res) => {
  res.writeHead(200, {
    "Content-Type": "text/plain;charset=utf-8",
    "Transfer-Encoding": "chunked",
    "Access-Control-Allow-Origin": "*",
  });
  const chunks = await splitChunks();
  for(let i = 0; i < chunks.length; i++) {
    setTimeout(() => {
      // Node's http module applies the chunked framing itself,
      // so each res.write() here goes out as one chunk on the wire
      res.write(`${chunks[i].join("&")}\n`);
    }, i * 1000);
  }
  setTimeout(() => {
    res.end();
  }, chunks.length * 1000);
})
server.listen(3000, () => {
  console.log(`server started successfully`)
})
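On the browser side, a chunked response like this can be consumed progressively instead of waiting for the whole body to arrive. A minimal client-side sketch using fetch and a ReadableStream reader (assuming the server above is running locally on port 3000):

// Read the chunked response piece by piece in the browser
async function readChunkedResponse() {
  const response = await fetch('http://localhost:3000/');
  const reader = response.body.getReader();
  const decoder = new TextDecoder('utf-8');

  while (true) {
    const { done, value } = await reader.read();
    if (done) break;
    // value is a Uint8Array containing the bytes received so far
    console.log('received chunk:', decoder.decode(value, { stream: true }));
  }
  console.log('transfer finished');
}
readChunkedResponse();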

3. Transfer through data streams

When using Node.js to return a large file to the client, returning it as a stream avoids loading the entire file into memory. The implementation is shown below. When the file is returned as a stream, the Transfer-Encoding field of the HTTP response header is chunked, indicating that the data is sent in a series of chunks.

const server = http.createServer((req, res) => {
  res.writeHead(200, {
    "Content-Type": "text/plain;charset=utf-8",
    "Content-Encoding": "gzip",
    "Transfer-Encoding": "chunked"
  });
  // Read the file as a stream, compress it on the fly, and pipe it to the response
  fs.createReadStream(__dirname + "/file.txt")
    .setEncoding("utf-8")
    .pipe(zlib.createGzip())
    .pipe(res);
})

server.listen(3000, () => {
  console.log(`server started successfully`)
})
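One caveat of chaining .pipe() calls is that an error on any stream in the chain is not propagated to the others automatically. Node's built-in stream.pipeline forwards errors and cleans up every stream in the chain; a minimal sketch of the same server using it could look like this:

const http = require('http');
const fs = require('fs');
const zlib = require('zlib');
const { pipeline } = require('stream');

const server = http.createServer((req, res) => {
  res.writeHead(200, {
    "Content-Type": "text/plain;charset=utf-8",
    "Content-Encoding": "gzip",
  });
  // pipeline wires the streams together and reports any error in one place
  pipeline(
    fs.createReadStream(__dirname + '/file.txt'),
    zlib.createGzip(),
    res,
    (err) => {
      if (err) console.error('stream transfer failed:', err);
    }
  );
})
server.listen(3000, () => {
  console.log(`server started successfully`)
})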

For more Node.js-related knowledge, please visit: nodejs tutorial!

The above is the detailed content of How to implement HTTP transfer of large files based on nodejs? (Sharing of practical methods). For more information, please follow other related articles on the PHP Chinese website!

Statement:
This article is reproduced from juejin.cn. If there is any infringement, please contact admin@php.cn to have it deleted.