LeetCode Meditations — Chapter 1: Arrays & Hashing
Before starting the Arrays & Hashing section in the Blind 75 list, let's very briefly get to know the prerequisite topics for this chapter:
- dynamic arrays
- hash tables
- prefix sums
Dynamic Arrays
Dynamic arrays are, well, dynamic. They're flexible, and can change their size during execution.
Python's list type is a dynamic array.
We can create an items list, for example:
items = [3, 5]
The length of items is obviously 2, but its capacity is greater than or equal to its length. Capacity refers to the total allocated space, whereas length is the number of elements actually stored.
Since dynamic arrays are still arrays, they need a contiguous block of memory.
We can easily add an item to items:
items.append(7)
And add some more:
items.append(9)
items.append(11)
items.append(13)
All the while, the length and capacity of items keep growing dynamically.
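One rough way to observe this over-allocation in CPython (a small side sketch, separate from the items example above) is to watch the list object's reported size in bytes, which jumps only when a larger underlying array is allocated:

import sys

numbers = []
previous_size = sys.getsizeof(numbers)

for n in range(20):
    numbers.append(n)
    size = sys.getsizeof(numbers)
    if size != previous_size:
        # The reported size jumps only when CPython reallocates a larger
        # underlying array, i.e. when the capacity grows.
        print(f"length {len(numbers)}: {previous_size} -> {size} bytes")
        previous_size = size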
Time and space complexity
Accessing an element is O(1), as we have random access.
Inserting a new element or deleting one is O(n) (think about having to shift all the other elements over when inserting or deleting an item at the front). But, in order not to be too pessimistic, we can look at amortized analysis; in that case, inserting or deleting at the end of the array becomes O(1).
Space complexity is O(n), because of the excess space.
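To see where the amortized O(1) comes from, here is a minimal sketch of a dynamic array that doubles its capacity whenever it runs out of room (a simplified illustration, not how Python's list is actually implemented):

class DynamicArray:
    def __init__(self):
        self.capacity = 1                    # total allocated slots
        self.length = 0                      # slots actually in use
        self.data = [None] * self.capacity

    def append(self, value):
        if self.length == self.capacity:
            self._resize(2 * self.capacity)  # occasional O(n) copy
        self.data[self.length] = value       # usual O(1) write
        self.length += 1

    def _resize(self, new_capacity):
        new_data = [None] * new_capacity
        for i in range(self.length):         # copy existing elements over
            new_data[i] = self.data[i]
        self.data = new_data
        self.capacity = new_capacity

Most appends are a single write; the occasional copy of all n elements is spread across the many cheap appends that preceded it, which is why the average cost per append comes out to O(1).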
Hash Tables
A hash table maps keys to values, implementing an associative array.
Python's dict is one example:
number_of_petals = {
    'Euphorbia': 2,
    'Trillium': 3,
    'Columbine': 5,
}
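Looking up a value by its key, or adding a new pair, is then straightforward (a quick usage sketch; the Buttercup entry is just an extra example):

print(number_of_petals['Trillium'])   # 3

number_of_petals['Buttercup'] = 5     # adding a new key-value pair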
JavaScript objects can be used in a similar way:
let numberOfMoons = {
  'Earth': 1,
  'Mars': 2,
  'Jupiter': 95,
  'Saturn': 146,
  'Uranus': 27,
  'Neptune': 14,
};
There are two important ingredients for a hash table:
- an array of "buckets" to store the data
- a hash function to map the data to a specific index in the array
Hash values are usually large integers, so to find an index, we take the hash of each item's key modulo the length of the array; the remainder tells us which "bucket" the item should go into.
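For example, assuming an array of 8 buckets, the index could be computed like this (a sketch using Python's built-in hash; note that string hashes can change between runs):

num_buckets = 8
key = 'Earth'

index = hash(key) % num_buckets   # an integer between 0 and 7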
The ratio of the number of elements to the number of buckets is called the load factor, and the higher it gets, the more collisions occur (a collision happens when two elements map to the same place in the array).
There are collision resolution tactics like linear probing (probing through the array until an empty bucket is found) and chaining (storing all the elements that land in the same bucket as a linked list), but we'll not go into them in depth for now.
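Still, to make the buckets-plus-hash-function idea concrete, here is a minimal, simplified sketch of a hash table that resolves collisions by chaining: each bucket is a plain Python list of key-value pairs, and it never resizes (a real implementation would also grow the bucket array as the load factor increases):

class SimpleHashTable:
    def __init__(self, num_buckets=8):
        # Each bucket is a list of (key, value) pairs, i.e. the "chain".
        self.buckets = [[] for _ in range(num_buckets)]

    def _index(self, key):
        return hash(key) % len(self.buckets)

    def put(self, key, value):
        bucket = self.buckets[self._index(key)]
        for i, (existing_key, _) in enumerate(bucket):
            if existing_key == key:
                bucket[i] = (key, value)   # update an existing key
                return
        bucket.append((key, value))        # otherwise, chain a new pair

    def get(self, key):
        bucket = self.buckets[self._index(key)]
        for existing_key, value in bucket:
            if existing_key == key:
                return value
        raise KeyError(key)

Using it looks much like using a dict:

table = SimpleHashTable()
table.put('Euphorbia', 2)
table.put('Trillium', 3)
print(table.get('Trillium'))   # 3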
Time and space complexity
The average case for searching, inserting, and deleting operations is O(1), as we use keys to look up values directly.
Space complexity is O(n), as it grows linearly with the number of elements.
Prefix Sums
A prefix sum is the sequence of numbers we get from the running totals of another sequence. It's also called the cumulative sum.
The first element of the resulting array is the first element of the input array. That's fine. Starting from the second item, we add each element to the running total of the previous ones as we go. That is, result[i] = result[i - 1] + nums[i]; for example, the prefix sums of [1, 2, 3, 4] are [1, 3, 6, 10].
In code, we can implement that easily:
def runningSum(nums):
    result = [nums[0]]
    for i in range(1, len(nums)):
        result.append(result[i - 1] + nums[i])
    return result
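A quick check with the example above:

print(runningSum([1, 2, 3, 4]))   # [1, 3, 6, 10]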
Time and space complexity
Time complexity for a prefix sum is O(n), because we're iterating over each of the elements in the array. The space complexity is also O(n), because the space we need grows with the length of the original array.
And, we're done with the introduction to the first chapter. Now it's time to take a breath and notice your surroundings. Maybe it's raining, or a bird sings nearby, or there's just the silence of the night. Or none of them, and that's all fine.
The first problem to look at will be Contains Duplicate, so until then, happy coding.