This article covers basic JavaScript skills: we'll walk through the common ways to combine/merge two JS arrays and compare the advantages and disadvantages of each approach.
Let's start with a concrete scenario:
var q = [ 5, 5, 1, 9, 9, 6, 4, 5, 8];
var b = [ "tie", "mao", "csdn", "ren", "fu", "fei" ];
Obviously, the result of simply joining arrays q and b end-to-end should be:
[
5, 5, 1, 9, 9, 6, 4, 5, 8,
"tie", "mao", "csdn", "ren", "fu", "fei"
]
concat(..) method
The most common usage is as follows:
var c = q.concat( b );
q; // [5,5,1,9,9,6,4,5,8]
b; // ["tie","mao","csdn","ren","fu","fei"];
c; // [5,5,1,9,9,6,4,5,8,"tie","mao","csdn","ren","fu","fei"]
As you can see, c is a brand-new array representing the combination of q and b, while q and b are left untouched. Simple, right? But now q and b are just sitting there unused.
What if the q array has 10,000 elements, and the b array has another 10,000? Array c now holds 20,000 elements, roughly doubling the memory usage.
"No problem!", you may think. Just discard q and b so they get garbage collected, right? Problem solved!
q = b = null; // `q` and `b` can now be garbage collected
Hmm. For a couple of small arrays, this is perfectly fine. But for large arrays, or for a process that repeats this merging over and over, memory is not unlimited, and the code needs optimizing.
Loop insertion
OK, let's try appending the contents of one array onto the other, using the Array#push() method:
// Insert array `b` into `q`
for (var i=0; i < b.length; i++) {
    q.push( b[i] );
}
q; // [5,5,1,9,9,6,4,5,8,"tie","mao","csdn","ren","fu","fei"]
b = null;
Now q holds the contents of both original arrays (q + b). That seems like a decent job of memory optimization.
But what if q is tiny and b is comparatively huge? For both memory and speed, you'd want to insert the smaller q at the front of b. No problem: just use the unshift() method instead of push(), and loop in the opposite direction, from the highest index down to zero:
// `q` into `b`:
for (var i=q.length-1; i >= 0; i--) {
    b.unshift( q[i] );
}
b; // [5,5,1,9,9,6,4,5,8,"tie","mao","csdn","ren","fu","fei"]
q = null;
Practical Tips
Sadly, these for loops are clunky and harder to maintain. Can we do better?
Let's try Array#reduce first:
// `b` onto `q`:
q = b.reduce( function(coll,item){
    coll.push( item );
    return coll;
}, q );
q; // [5,5,1,9,9,6,4,5,8,"tie","mao","csdn","ren","fu","fei"]

// or `q` into `b`:
b = q.reduceRight( function(coll,item){
    coll.unshift( item );
    return coll;
}, b );
b; // [5,5,1,9,9,6,4,5,8,"tie","mao","csdn","ren","fu","fei"]
Array#reduce() and Array#reduceRight() are elegant, but a bit bulky, and most people can never quite remember them. The => arrow functions in ES6 (the sixth edition of the JS specification) can greatly reduce the amount of code, but the approach still requires one function call per array element, which is a drawback.
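For instance, here's a sketch of the same reduce combination rewritten with ES6 arrow functions, using the comma operator to push and then return the accumulator:

// `b` onto `q`:
q = b.reduce( (coll, item) => ( coll.push( item ), coll ), q );

// or `q` into `b`:
b = q.reduceRight( (coll, item) => ( coll.unshift( item ), coll ), b );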
So what about the following code instead?
// `b` onto `q`:
q.push.apply( q, b );
q; // [5,5,1,9,9,6,4,5,8,"tie","mao","csdn","ren","fu","fei"]
// or `q` into `b`:
b.unshift.apply( b, q );
b; // [5,5,1,9,9,6,4,5,8,"tie","mao","csdn","ren","fu","fei"]
Much nicer, right!? Especially since the unshift() call no longer has to worry about the reversed ordering from before. ES6's spread operator (the ... prefix) is nicer still: q.push( ...b ) or b.unshift( ...q ).
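Spelled out with our running example (ES6 only):

// `b` onto `q` with the spread operator:
q.push( ...b );
q; // [5,5,1,9,9,6,4,5,8,"tie","mao","csdn","ren","fu","fei"]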
However, this approach is dangerously over-optimistic. In both cases, whether you pass the array to apply() as its second argument (the first argument becomes the this context inside the called function), or use the ... spread operator, the array is actually spread out into the function's arguments.
The first major problem is that this temporarily doubles the memory usage (only temporarily, of course!), because the array contents are essentially copied onto the function's call stack. Moreover, different JS engines have different implementation-dependent limits on the number of arguments a function can accept.
If the array being added had, say, a million elements, the call would almost certainly exceed the size allowed for the call stack, whether it's a push() or an unshift() call. This technique is only safe for a few thousand elements, so it must be capped at some reasonable limit.
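You can see the failure mode for yourself; the exact ceiling is engine-dependent, so treat this as an illustration rather than a specification:

var big = new Array( 1000000 ).fill( 0 ); // a million elements
try {
    q.push.apply( q, big ); // spreads a million arguments onto the call stack
}
catch (err) {
    // many engines throw a RangeError here (stack or argument limit exceeded)
    console.log( err );
}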
Note: you can also try splice(), and you'll find it has the same limitation as push(..)/unshift(..).
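For completeness, here's a sketch of that splice() variant; the same stack-size caveat applies:

// insert all of `q` at the front of `b` in one call
b.splice.apply( b, [ 0, 0 ].concat( q ) );
b; // [5,5,1,9,9,6,4,5,8,"tie","mao","csdn","ren","fu","fei"]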
One option is to continue using this method, but with batch processing:
function combineInto(q,b) {
    // process up to 5,000 items at a time, walking the chunks from the
    // end of `q` so each chunk lands at the front of `b` in the right order
    for (var i=q.length; i > 0; i=i-5000) {
        b.unshift.apply( b, q.slice( Math.max( 0, i-5000 ), i ) );
    }
}
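A quick usage sketch with our running example:

combineInto( q, b );
b; // [5,5,1,9,9,6,4,5,8,"tie","mao","csdn","ren","fu","fei"]
q = null;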
Ugh. Now we're compromising readability (and possibly even performance!). Let's end this journey before we give up anything more.
Summary
Array#concat() is the tried-and-true way to combine two (or more) arrays. But it creates a new array rather than modifying one of the existing ones.
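Since concat() accepts any number of arguments, combining more than two arrays stays a one-liner; a small sketch:

// `merged` is a new array; `q` and `b` are untouched
var merged = [].concat( q, b, [ "more", "values" ] );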
There are many alternatives, but each has different trade-offs, and the right one depends on your actual situation.
The various advantages/disadvantages are listed above; perhaps the best options (including some not listed here) are reduce(..) and reduceRight(..).
Whatever you choose, think critically about your merging strategy rather than taking it for granted.