Here is a short article with some tips on working with JavaScript arrays. We will look at different ways to combine/merge two JS arrays, and discuss the advantages and disadvantages of each approach.
Let's consider the following situation first:
var a = [ 1, 2, 3, 4, 5, 6, 7, 8, 9 ];
var b = [ "foo", "bar", "baz", "bam", "bun", "fun" ];
Obviously, the simplest combined result would be:
[ 1, 2, 3, 4, 5, 6, 7, 8, 9, "foo", "bar", "baz", "bam", "bun", "fun" ]
concat(..)
This is the most common practice:
var c = a.concat( b );

a; // [1,2,3,4,5,6,7,8,9]
b; // ["foo","bar","baz","bam","bun","fun"]
c; // [1,2,3,4,5,6,7,8,9,"foo","bar","baz","bam","bun","fun"]
As you can see, c is a brand-new array representing the combination of the two arrays a and b, while leaving a and b unchanged. Simple, right?
But what if a has 10,000 elements and b has another 10,000? c will then hold 20,000 elements, roughly doubling the memory already used by a and b.
"No problem!" you said. Let them be garbage collected, set A and B to null, the problem is solved!
a = b = null; // 'a' and 'b' can now be garbage collected
Heh. For small arrays with only a few elements, this is fine. But for large arrays, memory-constrained environments, or code that has to repeat this process frequently, there is still plenty of room for improvement.
Loop insertion
OK, let's copy the contents of one array onto the end of the other, using Array#push(..):
// `b` onto `a`
for (var i=0; i < b.length; i++) {
    a.push( b[i] );
}

a; // [1,2,3,4,5,6,7,8,9,"foo","bar","baz","bam","bun","fun"]

b = null;

Now, array a has the contents of array b.
This seems to have a better memory footprint.
But what if a is the smaller array? For memory and speed reasons, you may want to put the smaller a onto the front of the larger b instead. No problem: just swap push(..) for unshift(..) and loop in the opposite direction:
// `a` into `b`:
for (var i=a.length-1; i >= 0; i--) {
    b.unshift( a[i] );
}

b; // [1,2,3,4,5,6,7,8,9,"foo","bar","baz","bam","bun","fun"]

Functional tricks
However, these for loops are ugly and harder to maintain. Can we do better?
This is our first attempt, using Array#reduce:
// `b` onto `a`:
a = b.reduce( function(coll,item){
    coll.push( item );
    return coll;
}, a );

a; // [1,2,3,4,5,6,7,8,9,"foo","bar","baz","bam","bun","fun"]

// or `a` into `b`:
b = a.reduceRight( function(coll,item){
    coll.unshift( item );
    return coll;
}, b );

b; // [1,2,3,4,5,6,7,8,9,"foo","bar","baz","bam","bun","fun"]

Array#reduce(..) and Array#reduceRight(..) are nice, but they are a little clunky. ES6's arrow functions (=>) would trim the code down somewhat, but each element still requires a function call, which is not ideal.
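As a rough sketch of that arrow-function variant (illustrative only, not a recommendation), the same two reductions could look like this:

// `b` onto `a`, with an ES6 arrow function (sketch only):
a = b.reduce( (coll, item) => { coll.push( item ); return coll; }, a );

// or `a` into `b`:
b = a.reduceRight( (coll, item) => { coll.unshift( item ); return coll; }, b );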
So how about this:
// `b` onto `a`:
a.push.apply( a, b );

a; // [1,2,3,4,5,6,7,8,9,"foo","bar","baz","bam","bun","fun"]

// or `a` into `b`:
b.unshift.apply( b, a );

b; // [1,2,3,4,5,6,7,8,9,"foo","bar","baz","bam","bun","fun"]
Much nicer, right? It's especially nice that unshift(..) no longer needs to worry about iterating in reverse order. ES6's spread operator is even nicer: a.push( ...b ) or b.unshift( ...a ).
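If your environment supports ES6, a minimal sketch of that spread version (using the same a and b) looks like this:

// `b` onto `a`:
a.push( ...b );
a; // [1,2,3,4,5,6,7,8,9,"foo","bar","baz","bam","bun","fun"]

// or `a` into `b`:
b.unshift( ...a );
b; // [1,2,3,4,5,6,7,8,9,"foo","bar","baz","bam","bun","fun"]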
Maximum array length limit
The first major problem is that memory usage doubles (only temporarily, of course!) because the elements are essentially copied onto the call stack as function arguments. Moreover, different JS engines impose different limits on how many arguments can be passed in a single call.
So if the array being added has, say, a million elements, you will almost certainly exceed the stack size allowed for that push(..) or unshift(..) call. This approach handles a few thousand elements just fine, but you have to be careful not to exceed a reasonable length limit.
Note: you can also try splice(..), but it has the same limitation as push(..) and unshift(..).
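As a hedged illustration (the actual limit varies by engine and version, so the numbers here are only indicative), attempting the apply(..) trick with a very large array can fail like this:

// illustrative sketch only: the exact threshold differs between engines
var huge = new Array( 1000000 ); // one million slots
var target = [];

try {
    target.push.apply( target, huge );
} catch (err) {
    // many engines throw a RangeError here, e.g.
    // "Maximum call stack size exceeded"
    console.log( err.name );
}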
There is one way to work around this maximum length limit:
function combineInto(a,b) {
    var len = a.length;
    // work backwards through `a` in chunks of 5000 so that the
    // repeated unshift(..) calls keep the elements in their original order
    for (var i=len; i > 0; i=i-5000) {
        b.unshift.apply( b, a.slice( Math.max( 0, i-5000 ), i ) );
    }
}

Wait, our readability is going backwards. And it may only get worse the further we go.
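For completeness, here is a minimal usage sketch, assuming the small a and b defined at the start of the article:

// usage sketch with the small example arrays from above
combineInto( a, b );
b; // [1,2,3,4,5,6,7,8,9,"foo","bar","baz","bam","bun","fun"]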
That's all for this article. I hope it helps you in learning JavaScript.