Let’s improve yesterday’s post
Yesterday’s post was about solving a classic interview question that asks you to find the intersection of two arrays (see post here). The answer was OK, but can we improve its Big O complexity?
Yes, we can improve the time complexity of the sample function by using a hash map (also known as an object or dictionary in some languages). This reduces the cost of checking whether an element exists in the second array from a linear scan to an average O(1) lookup. Here’s the updated implementation in JavaScript:
```javascript
function findIntersectionImproved(arr1, arr2) {
  let intersection = [];
  let map = {};

  // Record every element of arr2 as a key, so lookups are O(1) on average.
  for (let i = 0; i < arr2.length; i++) {
    map[arr2[i]] = true;
  }

  // Collect the elements of arr1 that also appear in arr2.
  for (let i = 0; i < arr1.length; i++) {
    if (map[arr1[i]]) {
      intersection.push(arr1[i]);
      delete map[arr1[i]]; // Remove the key so duplicates in arr1 aren't added twice.
    }
  }

  return intersection;
}
```
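To sanity-check it, here’s a quick usage example (the sample values are mine, not from the original post):

```javascript
const arr1 = [1, 2, 2, 3, 5];
const arr2 = [2, 3, 4];

console.log(findIntersectionImproved(arr1, arr2)); // [2, 3]
```

Even though 2 appears twice in `arr1`, it shows up only once in the result, because the `delete` removes the key after the first match.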
In this improved implementation, we use the following approach:
- Create an empty `intersection` array to store the common elements between the two input arrays.
- Create a `map` object to store the elements of the second array.
- Iterate through `arr2` and add each element as a key to the `map` object.
- Iterate through `arr1`. If an element exists in the `map`, add it to the `intersection` array and remove it from the `map` to prevent duplicates.
- Return the `intersection` array containing the common elements between the two input arrays.
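As a side note, modern JavaScript also has a built-in `Set`, which gives the same average O(1) lookups with a bit less ceremony. Here’s a minimal sketch of the same approach (my variation, not from the original post):

```javascript
function findIntersectionWithSet(arr1, arr2) {
  const seen = new Set(arr2); // One pass over arr2 to build the lookup structure.
  const intersection = [];

  for (const value of arr1) {
    if (seen.has(value)) {
      intersection.push(value);
      seen.delete(value); // Same duplicate-prevention trick as above.
    }
  }

  return intersection;
}
```

A `Set` also sidesteps a quirk of plain objects: object keys are always strings, so in the `map` version the number 1 and the string '1' would collide as the same key.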
The time complexity analysis for this improved implementation is as follows:
- Iterating through `arr2` and creating the `map` object takes O(m) time, where m is the length of `arr2`.
- Iterating through `arr1` and checking for the existence of each element in the `map` object takes O(n) time, where n is the length of `arr1`.
Thus, the overall time complexity of the improved function is O(n + m). This is an improvement over the previous implementation’s time complexity of O(n * m). By using a hash map, we’ve reduced the time complexity to linear growth based on the sizes of the input arrays, instead of having a multiplicative effect.
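If you want to see the difference for yourself, here’s a rough, unscientific timing sketch. The `findIntersectionNaive` function below is just a stand-in for yesterday’s nested-loop version, not the exact original code; any O(n * m) nested-scan variant will show the same trend:

```javascript
// Stand-in for the naive O(n * m) approach (reconstructed for comparison only).
function findIntersectionNaive(arr1, arr2) {
  const intersection = [];
  for (let i = 0; i < arr1.length; i++) {
    // includes() scans arr2 linearly, which is where the extra factor of m comes from.
    if (arr2.includes(arr1[i]) && !intersection.includes(arr1[i])) {
      intersection.push(arr1[i]);
    }
  }
  return intersection;
}

// Two arrays of 20,000 random integers each.
const big1 = Array.from({ length: 20000 }, () => Math.floor(Math.random() * 100000));
const big2 = Array.from({ length: 20000 }, () => Math.floor(Math.random() * 100000));

console.time('naive O(n * m)');
findIntersectionNaive(big1, big2);
console.timeEnd('naive O(n * m)');

console.time('improved O(n + m)');
findIntersectionImproved(big1, big2);
console.timeEnd('improved O(n + m)');
```

Exact timings vary by machine, but the gap grows quickly as the arrays get larger.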