
⚠️ If you have photosensitivity, you probably want to skip this.
See the static image below; those lights will start blinking real fast!

Chunk-Busters: Don’t cross the Streams!

How does the internet work?

Remember the title… we are talking about streams here.

I could talk about protocols, packets, ordering, acks, and nacks… but we are talking about streams here, and as you probably guessed (I believe in you =D), with streams… it’s either binary or strings.

Yes, strings are zipped before being sent… but for what we usually care about in frontend and backend development… it comes down to strings and binary.

In the following examples, I’ll be using JS streams.

While Node has its own legacy stream implementations, we now have ways to deal with streams using the exact same code, be it on the front or the back.

Other languages have their own ways of dealing with streams, but as you’ll see… the actual code for dealing with them isn’t that complicated (not to say there aren’t complex things happening underneath).

The example problem

You have a frontend that has to consume data from multiple sources.

While you could access each source individually via its IP/port, you put them behind an API Gateway for ease of use and control.

The Repo

Check the repo at the link below; there you’ll learn how to run it yourself so you can play with it.

https://github.com/Noriller/chunk-busters

The Video

Video version to follow along:

https://youtu.be/QucaOfFI0fM

v0 - the naive implementation

You have the sources, you fetch, wait, and render. Rinse and repeat.

await fetch1();
handleResult(1);
await fetch2();
handleResult(2);
...
await fetch9();
handleResult(9);

You might be thinking that no one will actually do that…

In this example, it's clear that something is wrong, but it’s not that hard to fall into this.

The obvious problem: it’s slow. You fire and wait for each request in turn, and if any one of them is slow… everything after it waits too.

v1 - the eager version

You know you don’t want to wait for each request individually… so you fire all and then wait for them to complete.

const results = await Promise.all([
  fetch1(),
  fetch2(),
  ...
  fetch9(),
]);
handleAllResults(results);

This is what you probably do, so it’s good, right?

I mean, except that if even ONE request is slow… then even if all the others are already done… you still have to wait for that one to complete.

v2 - the smarter, eager, version

You know some requests might be slower, so you still fire them all and wait, but as each one comes back you already do something with its result, so by the time the last one arrives, the others have already been handled.

await Promise.all([
  fetch1().then(handleResult),
  fetch2().then(handleResult),
  ...
  fetch9().then(handleResult),
]);

This HAS to be the best solution, right?

Hmm… do you notice something weird?

v3 - I was lying to you… this is what v1 should look like

Remember v1? Yeah… this is what it should look like:

Turns out there’s a limit to how many simultaneous connections you can have to the same host over http/1, and not only that… it’s browser-dependent: each browser may enforce a different limit (typically around six).

You might think to just use http/2 and call it a day… but even if that were a good enough solution, you would still have to deal with multiple endpoints in the frontend.

Is there even a good solution for this?

v4 - enter streams!

Let’s revisit v0 but using streams…

You’re smart, so you were probably expecting this since the warning spoiled it a little… but yeah… what you were seeing before was not all the data that the backend was generating.

Anyway… as we fetch we render.

// same calls as v0…
await fetch1();
handleResult(1);
await fetch2();
handleResult(2);
...
await fetch9();
handleResult(9);

If we instead tap into the incoming stream, we can do something with each chunk of data as it arrives. (Yes! Like ChatGPT and the like do.)
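As a minimal sketch of what that tapping can look like with the Fetch API (the /api/N endpoints and the render function are hypothetical stand-ins, not the repo’s actual code):

// a hypothetical helper: read one response body and render every chunk as it arrives
async function streamAndRender(res) {
  const reader = res.body.getReader();
  const decoder = new TextDecoder();
  while (true) {
    const { done, value } = await reader.read();
    if (done) break;
    // value is a Uint8Array; decode it and render it right away
    render(decoder.decode(value, { stream: true }));
  }
}

// v0 again, one source at a time… but now each one paints as it streams in
await fetch('/api/1').then(streamAndRender);
await fetch('/api/2').then(streamAndRender);
...
await fetch('/api/9').then(streamAndRender);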

Even if v0 is the worst way of handling this problem, it’s greatly improved by using streams. You can trick the user by showing something, anything, even if the total wait time is the same.

v5 - v1, again, but with streams!

The http/1 problem is still an issue, but again, you can already see things as they come.
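Reusing the hypothetical streamAndRender helper from the v4 sketch, v5 is just the eager version of the same idea:

await Promise.all(
  [1, 2, 3, 4, 5, 6, 7, 8, 9].map((n) =>
    fetch(`/api/${n}`).then(streamAndRender)
  )
);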

Yeah… I can’t stall this anymore… so…

v6 - one API to rule them all!

Or… maybe I can?

You see, the frontend had to manage too much… if we offload that to the backend, we can have one endpoint that handles all the sources.

This solves both the frontend complexity and the http/1 connection limit.

// the gateway now does what the frontend did before:
// fan out to all the sources and wait for everything
await Promise.all([
  fetch1().then(handleResult),
  fetch2().then(handleResult),
  ...
  fetch9().then(handleResult),
]);

// and on the frontend, we usually do this:
await fetch(...).then((res) => {
  // this json() call accumulates the whole response,
  // which is only handed back to you once it's complete
  return res.json()
})
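On the gateway side, a minimal sketch of what that one endpoint could look like (a plain Node 18+ style handler; the source URLs are made up, not the repo’s actual setup):

// hypothetical gateway: call every source, wait for all of them, answer once
async function handleGetAll(req, res) {
  const sources = [1, 2, 3, 4, 5, 6, 7, 8, 9];
  const results = await Promise.all(
    sources.map((n) => fetch(`http://source-${n}:3000/api`).then((r) => r.json())),
  );
  res.setHeader('Content-Type', 'application/json');
  res.end(JSON.stringify(results));
}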

v7 - and finally… one API, multiple sources, and streaming.

We call one API that will call all the sources, stream the data, handle it, and pass it on to the front, which will, in turn, render the data as it comes.

The code used for this is basically the same on both front and back:

// read the single streamed response, chunk by chunk
for await (const chunk of fetchAll()) {
  handleChunk(chunk);
}

Yes… that's it (as the most basic and easy example goes).

We add the incoming string to a buffer, parse it, check if there’s a usable chunk, use it, and then forget it. This means you could receive/consume TBs of data… one chunk at a time, with very little RAM.
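Under the hood, a minimal sketch of that buffer-parse-forget loop, assuming the backend marks chunk boundaries with a delimiter (the '|' delimiter, the /api/all endpoint, and useChunk are all made up for illustration):

const res = await fetch('/api/all');
const reader = res.body.getReader();
const decoder = new TextDecoder();
let buffer = '';

while (true) {
  const { done, value } = await reader.read();
  if (done) break;
  buffer += decoder.decode(value, { stream: true });

  // split off every complete chunk; keep the (possibly partial) tail
  const parts = buffer.split('|');
  buffer = parts.pop();
  for (const part of parts) {
    useChunk(part); // use it…
  }
  // …and forget it: only the unparsed tail stays in memory
}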

I know what you’re thinking… and it’s stupid… it’s also madness…

MOOOOOOOM I want Websockets!

No sweetheart, we have websockets at home!

Websockets at home: next?

v8 - it’s only stupid if it doesn’t work

You’re smart: you thought that if the source is still generating data… then maybe we could update some variables on the fly…

This way you can keep that one open connection and use it to get more data, or change something about what it’s generating.

Yes… I guess you could do that… and I did build the example at your insistence. =D

Still… it’s a stupid idea; I don’t know where, or if, it could be used in a real production environment. Maybe if you traveled back in time to that awkward JS phase between MPAs and Ajax, where you had enough interactivity but not enough connections to the same server (some browsers had a limit of only 2!)… then maybe?

Aside from that, no idea. If you do have… let me know.

In the example above, pay attention to the center board, especially the “progress border”: you can see that it keeps updating. If you opened the network tab, you would see that the GET connection is never closed before the end. You would also see multiple other requests that change what that one still-open connection is doing… all of that with vanilla http/1.
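A hedged sketch of the idea, again reusing the hypothetical streamAndRender helper (every endpoint and payload here is made up):

// one long-lived GET that keeps streaming and rendering until the server ends it
const live = fetch('/api/live').then(streamAndRender);

// meanwhile, ordinary requests tell the server to change what that open stream is generating
await fetch('/api/live/speed', {
  method: 'POST',
  body: JSON.stringify({ interval: 100 }),
});

await live; // only resolves when the stream finally closes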

What’s next?

String vs JSON

This example is the most basic I could make. I’m even using simple strings instead of JSON since it’s easier to parse.

To use JSON, you have to accumulate the string (we do have to JSON.stringify the backend response for a reason).

Then you either check where to break it and parse that piece, or you parse as you go.

For the first approach, think NDJSON: instead of a JSON array, you separate the objects with newlines; then you can “more easily” find where to break, JSON.parse each piece, and use the object.
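Sketching that on top of the earlier buffer loop, only the inner parsing step changes (handleObject is a made-up stand-in):

// inside the same read loop as before:
buffer += decoder.decode(value, { stream: true });

const lines = buffer.split('\n');
buffer = lines.pop(); // keep the last, possibly incomplete, line
for (const line of lines) {
  if (line.trim()) handleObject(JSON.parse(line));
}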

For the latter, you parse as you go, as in: you know you’re inside an array, now it’s an object, OK, first key, now the value of that key, next key, skip that one, next key… and so on and on. It’s not trivial to do manually, but it’s the same kind of jump as going from “await then render” to “render as you await”, which is what this is all about… just on an even smaller scale.

Error Handling

People like to host examples; this one you need to run yourself… I hope the reason for not hosting the example somewhere is clear by now. Another reason is that we don’t expect any errors here, and if you were to add network errors on top of everything else… well…

Errors should be handled, but they do add another layer of complexity.
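For instance, a network error can hit mid-stream, after part of the data has already been rendered, which is exactly that extra layer (hypothetical helpers again):

try {
  await fetch('/api/all').then(streamAndRender);
} catch (err) {
  // by the time this fires, some chunks may already be on screen
  showPartialDataWarning(err); // hypothetical recovery step
}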

Should you be using it?

Maybe… you can say it depends

There are places where streaming is the answer, but in most cases… await res.json() is enough (not to mention easier).

But learning about streams opens up ways to solve some problems, be it in the frontend or the backend.

In the frontend, you can always use this to “trick” the user. Instead of showing spinners everywhere, you can show something as soon as it arrives, and then keep showing more as it comes, even if the whole thing takes a while. As long as you don’t block the user from interacting with it… you can even make something that is technically “slower” than just showing spinners feel way faster than options that are actually faster.

In the backend, you can save on RAM, since you can parse each chunk of data as it comes, be it from the front, a database, or anything else in between. Handle the data as needed and send it along, without having to wait for an entire payload that would throw an OOM (Out of Memory) error. GBs or even TBs of data… sure, why not?
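As one contrived Node sketch of that idea: transform a file far bigger than your RAM, chunk by chunk, without ever holding it all in memory (the file names are made up):

import { createReadStream, createWriteStream } from 'node:fs';
import { Transform } from 'node:stream';
import { pipeline } from 'node:stream/promises';

await pipeline(
  createReadStream('huge-input.txt'), // could be GBs or TBs
  new Transform({
    transform(chunk, _encoding, callback) {
      // handle one chunk at a time and pass it straight along
      callback(null, chunk.toString().toUpperCase());
    },
  }),
  createWriteStream('huge-output.txt'),
);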

Outro

Is React slow? This whole example frontend was done with React and aside from the “main” thing going on with all the blinking “lights”, there’s a lot of other stuff going on.

Yes… if you go fast enough, the example can’t keep up and starts freezing. But since there are easily thousands of renders going on per minute… I do think it’s enough for most applications.

And you can always improve performance: for the “progress border”, I’ve used deferred values to make it smoother, if you need to save some renders… I could have done this and other performance enhancements for the “lights” and the title, but it would just make the “lights” stop blinking a lot of the time (which wouldn’t make for a nice demo), and the “electric” underline in the title wouldn’t be as fun as it is.
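For reference, a tiny sketch of what “deferred values” refers to here (not the repo’s actual component, just the React API involved; the prop and markup are made up):

import { useDeferredValue } from 'react';

function ProgressBorder({ progress }) {
  // during heavy render bursts, React is allowed to lag this value behind
  const deferredProgress = useDeferredValue(progress);
  return <div className="progress-border" style={{ width: `${deferredProgress}%` }} />;
}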

In this example, all those “improvements” wouldn’t be ideal, but for normal applications... you can make it handle a lot. And if you do need something more, then, in that case, use another solution.

Conclusion

Add streams to your arsenal… they might not be a panacea, but they will surely come in handy someday.

And if you’re gonna do something with it and want help, well… maybe give me a call. =P
