N ways to merge arrays in JavaScript
This is a short article with some tips on working with JavaScript arrays. We will combine/merge two JS arrays using several different methods, and discuss the advantages and disadvantages of each.
Let us first consider the following situation:
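The example arrays themselves are missing from this excerpt. Reconstructed from the results shown later in the article, they were presumably something like:

```javascript
// Two example arrays, inferred from the merged results shown below
var a = [ 1, 2, 3, 4, 5, 6, 7, 8, 9 ];
var b = [ "foo", "bar", "baz", "bam", "bun", "fun" ];
```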
concat(..)
This is the most common approach:
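The snippet is missing here; with `concat(..)`, using fresh copies of the example arrays, it would look like this:

```javascript
var a = [ 1, 2, 3, 4, 5, 6, 7, 8, 9 ];
var b = [ "foo", "bar", "baz", "bam", "bun", "fun" ];

// `concat(..)` returns a brand-new array; `a` and `b` are untouched
var c = a.concat( b );

c; // [1,2,3,4,5,6,7,8,9,"foo","bar","baz","bam","bun","fun"]
a; // [1,2,3,4,5,6,7,8,9] -- unchanged
b; // ["foo","bar","baz","bam","bun","fun"] -- unchanged
```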
As you can see, c is a brand-new array representing the combination of the two arrays a and b, leaving a and b unchanged. Simple, right?
But what if a has 10,000 elements and b also has 10,000? c will then have 20,000 elements, so while a and b are still around, memory usage roughly doubles.
“No problem!”, you say. Just let them be garbage collected: set a and b to null, problem solved!
Loop insertion
Okay, let’s copy the contents of one array into the other instead, using Array#push(..):
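The loop itself is missing from this excerpt; a sketch of the loop-and-push approach, using fresh copies of the example arrays, would be:

```javascript
var a = [ 1, 2, 3, 4, 5, 6, 7, 8, 9 ];
var b = [ "foo", "bar", "baz", "bam", "bun", "fun" ];

// `b` onto `a`: append each element in place
for (var i = 0; i < b.length; i++) {
    a.push( b[i] );
}

a; // [1,2,3,4,5,6,7,8,9,"foo","bar","baz","bam","bun","fun"]
b = null; // let the original `b` be garbage collected
```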
This seems to have a better memory footprint: no third array, just b's elements appended onto a.
But what if array a is smaller? For memory and speed reasons, you may want to insert the smaller a at the front of b instead. No problem: just replace push(..) with unshift(..) and loop in the opposite direction:
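That loop is also missing here; a sketch, again with fresh copies of the arrays. Note that iterating a in reverse is what keeps its elements in their original order at the front of b:

```javascript
var a = [ 1, 2, 3, 4, 5, 6, 7, 8, 9 ];
var b = [ "foo", "bar", "baz", "bam", "bun", "fun" ];

// `a` into `b`: walk `a` backwards so each `unshift(..)`
// lands the elements at the front of `b` in original order
for (var i = a.length - 1; i >= 0; i--) {
    b.unshift( a[i] );
}

b; // [1,2,3,4,5,6,7,8,9,"foo","bar","baz","bam","bun","fun"]
```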
Functional tricks
However, the for loop is indeed ugly and difficult to maintain. Can we do better?
This is our first attempt, using Array#reduce:
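The code for this first attempt is missing from the excerpt; a reduce(..) version consistent with the result shown below (a reconstruction, not necessarily the author's exact code) would be:

```javascript
var a = [ 1, 2, 3, 4, 5, 6, 7, 8, 9 ];
var b = [ "foo", "bar", "baz", "bam", "bun", "fun" ];

// `b` onto `a`: the accumulator starts as `a` itself
a = b.reduce( function(coll, item){
    coll.push( item );
    return coll;
}, a );
```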
a; // [1,2,3,4,5,6,7,8,9,"foo","bar","baz","bam","bun","fun"]
// or `a` into `b`:
b = a.reduceRight( function(coll,item){
coll.unshift(item);
return coll;
}, b );
b; // [1,2,3,4,5,6,7,8,9,"foo","bar","baz","bam","bun","fun"]
Array#reduce(..) and Array#reduceRight(..) are nice, but they are a little clunky. ES6's arrow functions (=>) will reduce the amount of code somewhat, but each approach still requires a function call per element, which is not ideal.
How about this:
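The snippet itself is missing; a version consistent with the result shown below, using apply to pass one array's elements as individual arguments:

```javascript
var a = [ 1, 2, 3, 4, 5, 6, 7, 8, 9 ];
var b = [ "foo", "bar", "baz", "bam", "bun", "fun" ];

// `b` onto `a` -- `apply` spreads `b`'s elements out
// as individual arguments to a single `push(..)` call
a.push.apply( a, b );
```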
a; // [1,2,3,4,5,6,7,8,9,"foo","bar","baz","bam","bun","fun"]
// or `a` into `b`:
b.unshift.apply( b, a );
b; // [1,2,3,4,5,6,7,8,9,"foo","bar","baz","bam","bun","fun"]
This one is much better, right? Especially since unshift(..) no longer needs the reverse-order looping from before: apply passes all of a's elements in a single call. And ES6's spread operator is more elegant still: a.push( ...b ) or b.unshift( ...a ).
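For completeness, the ES6 spread form in full (fresh copies of the example arrays again):

```javascript
let a = [ 1, 2, 3, 4, 5, 6, 7, 8, 9 ];
let b = [ "foo", "bar", "baz", "bam", "bun", "fun" ];

// ES6 spread: no `apply(..)` gymnastics needed
a.push( ...b );           // `b` onto `a`
// or, with fresh arrays:  b.unshift( ...a );

a; // [1,2,3,4,5,6,7,8,9,"foo","bar","baz","bam","bun","fun"]
```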
Maximum array length limit
The first major issue is that memory usage doubles (temporarily, of course!), because the appended elements are effectively copied onto the call stack as function arguments. In addition, different JS engines impose limits on how many arguments a single function call can receive.
So, if the array has a million elements, you will almost certainly exceed the call-stack limit for that single push(..) or unshift(..) call. Alas, it works fine with a few thousand elements, but you have to be careful not to exceed a reasonable length.
Note: you can also try splice(..), but it has the same limitation as push(..) and unshift(..).
There is a way to avoid this maximum length limit.
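The workaround itself is not shown in this excerpt. One common approach, sketched below (the chunk size of 5000 is an arbitrary "safe" value, and combineInto is an illustrative name, not an API), is to append in fixed-size chunks so no single apply call passes too many arguments:

```javascript
// Append all of `b` onto `a` in chunks, so that no single
// `apply(..)` call exceeds the engine's argument limit.
function combineInto(a, b) {
    var CHUNK = 5000; // arbitrary "safe" chunk size
    for (var i = 0; i < b.length; i += CHUNK) {
        a.push.apply( a, b.slice( i, i + CHUNK ) );
    }
    return a;
}

var a = [ 1, 2, 3 ];
var b = [ 4, 5, 6 ];
combineInto( a, b );
// a is now [1, 2, 3, 4, 5, 6]
```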
Wait a minute, though: readability has gone backwards in the process. That's how it goes; as the code gets more robust, it may well get uglier, alas.