Which is more efficient? (Both memory usage and speed.)
arrayOfArrays.reduce((acc, innerArray) => { acc.push(...innerArray); return acc; }, []);
Or
arrayOfArrays.reduce((acc, innerArray) => acc.concat(innerArray), []);
The first one is more efficient in the sense that you're reusing the accumulator array instead of recreating a new one on every iteration like the second one does. However, neither of them is really "efficient", in the sense that you're still looping... and there's actually a better way.
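Before getting to that, here's a minimal timing sketch you can run to see the difference between the two reduce approaches. The input size and the console.time labels are my own assumptions for illustration; actual numbers vary by engine and data shape.

// Rough timing sketch for the two reduce-based approaches (assumed input size).
const arrayOfArrays = Array.from({ length: 10000 }, (_, i) => [i, i + 1, i + 2])

console.time('reduce + push')
const viaPush = arrayOfArrays.reduce((acc, inner) => { acc.push(...inner); return acc }, [])
console.timeEnd('reduce + push')

console.time('reduce + concat')
const viaConcat = arrayOfArrays.reduce((acc, inner) => acc.concat(inner), [])
console.timeEnd('reduce + concat')

console.log(viaPush.length, viaConcat.length) // both 30000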
That better way: Array.prototype.concat accepts multiple arguments, so you can use the spread operator to pass the inner arrays as its arguments.
const arr = Array.prototype.concat.call([], ...arrayOfArrays)
Or the older way to do it, using Function.prototype.apply:
const arr = Array.prototype.concat.apply([], arrayOfArrays)
I'm being a bit too formal here; a shorter version of the spread approach looks like:
const arr = [].concat(...arrayOfArrays)
Choose your weapon:
const arrayOfArrays = [[1, 2], [3, 4], [5, 6]]
const arr1 = Array.prototype.concat.call([], ...arrayOfArrays)
const arr2 = Array.prototype.concat.apply([], arrayOfArrays)
const arr3 = [].concat(...arrayOfArrays)
console.log(arr1)
console.log(arr2)
console.log(arr3)
concat.call is consistently and significantly slower than the OP's original code. The other two examples seem to be more or less at the same level of efficiency, but the last one, using just concat and a spread, did sometimes execute a few milliseconds faster. Conclusion: do anything but the first example in this answer. – Commented Aug 11, 2017 at 13:37
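If you want to reproduce that comparison yourself, here's a minimal sketch under the same assumptions as the earlier one (made-up input size; timings differ between engines and runs):

// Compare the OP's original reduce against the three concat variants.
const arrayOfArrays = Array.from({ length: 10000 }, (_, i) => [i, i + 1, i + 2])

console.time('reduce + push (original)')
const viaReduce = arrayOfArrays.reduce((acc, inner) => { acc.push(...inner); return acc }, [])
console.timeEnd('reduce + push (original)')

console.time('concat.call + spread')
const viaCall = Array.prototype.concat.call([], ...arrayOfArrays)
console.timeEnd('concat.call + spread')

console.time('concat.apply')
const viaApply = Array.prototype.concat.apply([], arrayOfArrays)
console.timeEnd('concat.apply')

console.time('[].concat + spread')
const viaSpread = [].concat(...arrayOfArrays)
console.timeEnd('[].concat + spread')

console.log(viaReduce.length, viaCall.length, viaApply.length, viaSpread.length) // all 30000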