[JavaScript] Improve performance of Array.reduce

Published: 2023-04-07 20:06:53 · Author: Zhentiw

Compare these two code snippets:

const people = [
    { id: 1, name: 'John', age: 45 },
    { id: 2, name: 'Op', age: 32 },
    { id: 3, name: 'Wade', age: 39 }
]

// option 1
const res = people.reduce((acc, curr) => {
    return {
        ...acc,
        [curr.id]: curr
    }
}, {})

// option 2
const res2 = people.reduce((acc, curr) => {
    acc[curr.id] = curr
    return acc
}, {})


Both options produce the same result: they convert an array of objects into a single object whose keys come from each item's id property. However, they differ in performance and readability.
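
For reference, both res and res2 end up with the same id-keyed object:

// Both res and res2:
// {
//   1: { id: 1, name: 'John', age: 45 },
//   2: { id: 2, name: 'Op', age: 32 },
//   3: { id: 3, name: 'Wade', age: 39 }
// }
console.log(res[2].name) // "Op"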

Option 1:

const res = people.reduce((acc, curr) => {
    return {
        ...acc,
        [curr.id]: curr
    }
}, {})

This option uses the spread operator to build a brand-new object on every iteration, copying everything accumulated so far before adding the new key-value pair. Some developers find this style easier to read because it avoids mutation, but it is less efficient: re-copying the accumulator on every iteration makes the whole reduce roughly O(n²).


Option 2:

const res2 = people.reduce((acc, curr) => {
    acc[curr.id] = curr
    return acc
}, {})

This option modifies the accumulator object in place, adding one new key-value pair per iteration. Because nothing is copied, the whole reduce stays O(n), which makes it generally more efficient.

Based on the trade-offs, I would recommend Option 2 for better performance, especially if you're working with large datasets. However, if readability is a higher priority and performance is not a major concern, you might prefer Option 1.
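
If you want to see the gap yourself, a minimal benchmark sketch along these lines will do (the 10,000-element array and the timer labels are arbitrary choices for illustration, not part of the original snippets):

const bigList = Array.from({ length: 10000 }, (_, i) => ({ id: i, name: `user-${i}`, age: i % 80 }))

// option 1: copy the accumulator with spread on every iteration
console.time('spread accumulator')
bigList.reduce((acc, curr) => ({ ...acc, [curr.id]: curr }), {})
console.timeEnd('spread accumulator')

// option 2: mutate the accumulator in place
console.time('mutate accumulator')
bigList.reduce((acc, curr) => {
    acc[curr.id] = curr
    return acc
}, {})
console.timeEnd('mutate accumulator')

On a typical machine the spread version takes noticeably longer, and the gap grows quickly as the array gets larger.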


Now, let's see this code:

const numbers = [1, 2, 3, 4, 5]

numbers.reduce((acc, curr) => {
    return [
        ...acc,
        curr * 100
    ]
}, [])

You might be tempted to apply the same idea here to improve performance:

numbers.reduce((acc, curr) => {
    acc.push(curr * 100)
    return acc
}, [])

Nope. For a simple one-to-one transformation like this, you don't need reduce at all; just use .map:

const result = numbers.map(number => number * 100);
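
The result is the same either way:

console.log(result) // [100, 200, 300, 400, 500]

.map expresses the intent (a one-to-one transformation) directly, so there is no accumulator to manage and no spread-copy pitfall to worry about.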