Mastering Node.js Streams and Pipelines

Streams are Node.js's superpower for handling large datasets efficiently. Let's dive into streams and pipelines.

Why Streams?

  1. Memory efficiency: data is processed chunk by chunk instead of being loaded into memory all at once (see the sketch after this list)
  2. Time efficiency: processing can start as soon as the first chunk arrives, rather than after the whole payload is available
  3. Composability: small, focused streams can be chained together into pipelines
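
To make the memory difference concrete, here is a minimal sketch contrasting a buffered read with a streamed one. It reuses the big.file placeholder that appears later in the article; the byte counts are just illustrative output.

const fs = require('fs');

// Buffered: the whole file is loaded into memory before the callback runs.
fs.readFile('big.file', (err, data) => {
  if (err) throw err;
  console.log(`Buffered ${data.length} bytes at once`);
});

// Streamed: memory stays around one chunk (64 KB by default for fs streams).
let total = 0;
fs.createReadStream('big.file')
  .on('data', (chunk) => { total += chunk.length; })
  .on('end', () => console.log(`Streamed ${total} bytes chunk by chunk`));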

Types of Streams

  1. Readable: a source you pull data from
  2. Writable: a sink you push data into
  3. Duplex: both readable and writable
  4. Transform: a Duplex that rewrites data as it passes through (one built-in example of each is sketched below)
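
Node ships familiar instances of each type. This sketch just names one of each; the file paths are placeholders.

const fs = require('fs');
const net = require('net');
const zlib = require('zlib');

const readable = fs.createReadStream('input.txt');   // Readable: a source of data
const writable = fs.createWriteStream('output.txt'); // Writable: a destination for data
const duplex = new net.Socket();                      // Duplex: readable and writable (a TCP socket)
const transform = zlib.createGzip();                  // Transform: a Duplex that rewrites passing data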

Basic Stream Usage

const fs = require('fs');

const readStream = fs.createReadStream('big.file');
const writeStream = fs.createWriteStream('output.file');

// Copy the file chunk by chunk. write() returns false when the writable's
// internal buffer is full, so pause the readable and wait for 'drain';
// skipping this check would ignore backpressure (see Common Pitfalls below).
readStream.on('data', (chunk) => {
  if (!writeStream.write(chunk)) {
    readStream.pause();
    writeStream.once('drain', () => readStream.resume());
  }
});

// End the writable once the readable is exhausted.
readStream.on('end', () => {
  writeStream.end();
});

Enter Pipelines

Pipelines simplify stream composition and error handling: pipeline() chains the streams for you, propagates an error from any stage, and cleans up every stream when the chain completes or fails.

const { pipeline } = require('stream/promises');
const fs = require('fs');
const zlib = require('zlib');

async function compressFile(input, output) {
  // Read -> gzip -> write: pipeline() manages backpressure between stages
  // and rejects (destroying every stream) if any stage fails.
  await pipeline(
    fs.createReadStream(input),
    zlib.createGzip(),
    fs.createWriteStream(output)
  );
  console.log('Compression complete');
}

compressFile('big.file', 'big.file.gz').catch(console.error);

Custom Transform Streams

const { Transform } = require('stream');
const { pipeline } = require('stream/promises');

// A Transform stream receives chunks, rewrites them, and pushes the result
// downstream; callback() signals that this chunk has been handled.
const upperCaseTransform = new Transform({
  transform(chunk, encoding, callback) {
    this.push(chunk.toString().toUpperCase());
    callback();
  }
});

// Upper-case whatever arrives on stdin and write it to stdout.
pipeline(
  process.stdin,
  upperCaseTransform,
  process.stdout
).catch(console.error);
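
In recent Node.js versions, pipeline() also accepts an async generator in place of a Transform stream, which is often shorter for one-off transforms. A minimal sketch of the same upper-casing step:

const { pipeline } = require('stream/promises');

pipeline(
  process.stdin,
  // An async generator can act as an inline transform stage.
  async function* (source) {
    for await (const chunk of source) {
      yield chunk.toString().toUpperCase();
    }
  },
  process.stdout
).catch(console.error);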

Performance Tips

  1. Tune highWaterMark to control how much data each stream buffers (a combined sketch follows this list)
  2. Use stream.Readable.from() to turn (async) iterables into readable streams
  3. Use stream.finished() to detect when a stream is done or has failed, and clean up
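
Here is a minimal sketch touching all three tips; big.file is the same placeholder used earlier, and the 16 KB buffer size is just an example value.

const { Readable, finished } = require('stream');
const fs = require('fs');

// Tip 1: a smaller highWaterMark trades throughput for a tighter memory cap.
const reader = fs.createReadStream('big.file', { highWaterMark: 16 * 1024 });

// Tip 3: finished() invokes the callback once the stream ends or errors.
finished(reader, (err) => {
  if (err) console.error('Stream failed:', err);
  else console.log('Stream done, safe to release resources');
});
reader.resume(); // drain the stream so 'end' can fire in this sketch

// Tip 2: Readable.from() wraps any (async) iterable in a readable stream.
async function* numbers() {
  for (let i = 1; i <= 3; i++) yield `${i}\n`;
}
Readable.from(numbers()).pipe(process.stdout);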

Common Pitfalls

  1. Ignoring backpressure (writing without checking write()'s return value)
  2. Mishandling errors: pipe() does not forward them between streams (see the sketch after this list)
  3. Neglecting to end writable streams, which keeps resources open and means 'finish' never fires
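
A minimal sketch of the second pitfall, assuming a missing.file path that does not exist: because pipe() does not forward errors, each stream needs its own handler, or you can simply use pipeline(), which wires this up for you.

const fs = require('fs');

const src = fs.createReadStream('missing.file');  // will emit 'error' if the file is absent
const dest = fs.createWriteStream('output.file');

src.pipe(dest);                 // errors on src are NOT forwarded to dest
src.on('error', (err) => {
  console.error('read failed:', err.message);
  dest.destroy();               // clean up the writable side ourselves
});
dest.on('error', (err) => console.error('write failed:', err.message));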

Streams shine with large datasets or real-time data processing. Master them for scalable Node.js applications.

Cheers!
