Hi there,
I've created a new collection of articles called ♻️ Knowledge seeks community 🫶, and it will be dedicated to sharing knowledge through a problem-solution-performance comparison approach.
🚀 Get articles like this delivered straight to your inbox. Subscribe to Stef’s Dev Notes.
Today’s topic is JS-related: Removing Duplicates from an Array of Objects.
Introduction
In JavaScript, removing duplicates from an array of primitive values (numbers or strings) can be done easily using Set. However, when dealing with arrays of objects, things get tricky because objects are reference types.
This article explores multiple ways to efficiently remove duplicates from an array of objects, discussing their trade-offs and best use cases.
🚩 Problem
Let’s see the following example:
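A minimal sketch of the naive attempt (the users data below is made up for illustration):

```javascript
const users = [
  { id: 1, name: "Alice" },
  { id: 2, name: "Bob" },
  { id: 1, name: "Alice" }, // same data, but a different object reference
];

// Works for primitives, fails here:
const unique = [...new Set(users)];

console.log(unique.length); // 3 — nothing was removed
```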
Why does it fail?
The issue with this solution is that Set only removes duplicates for primitive values (like numbers or strings), but the given list contains objects. When you pass an array of objects to new Set(), it does not remove duplicates because objects are reference types, and each object has a unique reference.
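A quick illustration of the reference problem:

```javascript
// Two objects with identical contents are still two different references:
console.log({ id: 1 } === { id: 1 }); // false
console.log(new Set([{ id: 1 }, { id: 1 }]).size); // 2 — Set keeps both
```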
💡 Solutions
We will go through a few use cases to explore the different solutions.
Solution #1: Removing Duplicates by a Unique Key (id)
If objects have a unique identifier, we can use Map to remove duplicates efficiently.
This ensures that objects with duplicate id values are removed while keeping only the first occurrence.
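A minimal sketch of this approach, assuming each object carries a unique id (the data is made up for illustration):

```javascript
const users = [
  { id: 1, name: "Alice" },
  { id: 2, name: "Bob" },
  { id: 1, name: "Alice (copy)" }, // duplicate id
];

const byId = new Map();
for (const user of users) {
  // Only keep the first object seen for each id.
  if (!byId.has(user.id)) {
    byId.set(user.id, user);
  }
}

const unique = [...byId.values()];
console.log(unique); // [{ id: 1, name: "Alice" }, { id: 2, name: "Bob" }]
```

This runs in O(n), since each Map lookup and insertion is effectively constant time.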
Solution #2: Removing Duplicates Based on All Object Properties
Using Set() + JSON.stringify()
This solution is order-sensitive, meaning that all objects should have their properties in the same order.
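A minimal sketch (sample data made up); note how the object with reordered keys slips through:

```javascript
const items = [
  { id: 1, name: "Alice" },
  { id: 1, name: "Alice" }, // duplicate, removed
  { name: "Alice", id: 1 }, // same data, different key order — NOT removed
];

const seen = new Set();
const unique = items.filter((item) => {
  const key = JSON.stringify(item); // the key depends on property order
  if (seen.has(key)) return false;
  seen.add(key);
  return true;
});

console.log(unique.length); // 2
```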
Solution #3: Handling Objects with Different Property Orders
Using Set() + JSON.stringify() as a sorted JSON string
Sorting object keys before stringifying ensures identical objects are treated the same.
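A minimal sketch, using a small stableKey() helper (a name chosen for this example) that sorts top-level keys before stringifying; deeply nested objects would need a recursive version:

```javascript
const stableKey = (obj) =>
  JSON.stringify(Object.keys(obj).sort().map((k) => [k, obj[k]]));

const items = [
  { id: 1, name: "Alice" },
  { name: "Alice", id: 1 }, // same data, different key order
];

const seen = new Set();
const unique = items.filter((item) => {
  const key = stableKey(item);
  if (seen.has(key)) return false;
  seen.add(key);
  return true;
});

console.log(unique.length); // 1 — both objects map to the same sorted key
```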
Solution #4: Deep Comparison with Lodash (_.isEqual)
For complex objects, Lodash’s _.isEqual() performs a deep comparison.
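A minimal sketch, assuming Lodash is installed; the nested findIndex() makes this O(n²), so it is best reserved for small arrays of complex objects:

```javascript
import _ from "lodash";

const items = [
  { id: 1, profile: { name: "Alice", tags: ["admin"] } },
  { id: 1, profile: { name: "Alice", tags: ["admin"] } }, // deep duplicate
  { id: 2, profile: { name: "Bob", tags: [] } },
];

const unique = items.filter(
  (item, index, arr) =>
    // keep an item only if no earlier item is deeply equal to it
    arr.findIndex((other) => _.isEqual(other, item)) === index
);

console.log(unique.length); // 2
```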
Performance Comparison
Time complexity explanation - what are those values?
- O(n) - Linear Time Complexity: the execution time grows linearly with the number of elements (n) in the array. Example: iterating through an array once.
- O(n²) - Quadratic Time Complexity: the execution time grows quadratically with the number of elements (n). Example: a nested loop where every element is compared to every other element.
🗒️ If you want to know more about time and space complexity in JavaScript, check this resource.
🏆 Challenge: Handling Arrays Inside Objects
Now, I’m challenging you to do a small exercise by solving the following scenario:
P.S. To ensure your solution meets the required time and space complexity, I recommend testing it with a larger dataset and observing its performance.
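For instance, a rough timing harness could look like this (removeDuplicates is a placeholder for your own solution, and the data shape is arbitrary):

```javascript
// Generate ~100,000 objects with roughly 50% duplicates.
const bigDataset = Array.from({ length: 100_000 }, (_, i) => ({
  id: i % 50_000,
  name: `user-${i % 50_000}`,
}));

console.time("removeDuplicates");
const result = removeDuplicates(bigDataset); // your implementation here
console.timeEnd("removeDuplicates");

console.log(result.length); // expect ~50,000 with a correct solution
```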
P.P.S. In addition to the solutions discussed in the article, I suggest experimenting with the structuredClone() function.
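For context, structuredClone() (available in modern browsers and in Node.js 17+) produces a deep copy of a value. It doesn't deduplicate anything by itself, but it's handy when you want to work on nested data without mutating the original:

```javascript
const original = { id: 1, tags: ["a", "b"], meta: { active: true } };
const copy = structuredClone(original);

copy.tags.push("c");
console.log(original.tags); // ["a", "b"] — the original stays untouched
```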
Conclusion
Choosing the right approach depends on your dataset size and structure.
- For large datasets, use Map- or Set-based solutions for efficiency.
- For deeply nested objects, you can go with Lodash _.isEqual(), but expect performance trade-offs, or with structuredClone().
💬 Let’s talk:
Have you encountered this issue in a real project? Which solution worked best for you? Let me know in the comments!
Until next time 👋,
Stefania
👋 Get in touch
You can find me on LinkedIn where I share updates about what’s next on Stef’s Dev Notes, as well as information about various tech events and other topics.
You can learn more about Frontend Development, JavaScript, TypeScript, React, Next.js and Node.js in my newsletter: Stef’s Dev Notes.