13 changes: 9 additions & 4 deletions heaps/heap_sort.py
@@ -1,8 +1,13 @@

     from heaps.min_heap import MinHeap

     def heap_sort(list):
         """ This method uses a heap to sort an array.
    -    Time Complexity: ?
    -    Space Complexity: ?
    +    Time Complexity: O(n log n) where n is the number of items in the list
    +    Space Complexity: O(n) where n is the number of items in the list
Comment on lines +5 to +6


✨ Great. Since sorting using a heap reduces down to building up a heap of n items one-by-one (each taking O(log n)), then pulling them back out again (again taking O(log n) for each of n items), we end up with a time complexity of O(2n log n) → O(n log n). For the space, we do need to account for the O(log n) stack space consumed during each add and remove, but it isn't cumulative (it's only in use during that single call to add or remove). However, the internal store for the MinHeap does grow with the size of the input list. So the maximum space would be O(n + log n) → O(n), since n is the larger term.

Note that a fully in-place solution (O(1) space complexity) would require both avoiding the recursive calls, as well as working directly with the originally provided list (no internal store).
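
Purely as an illustration (not required for this exercise), an in-place version might look something like the sketch below: it builds a max-heap directly inside the provided list with an iterative sift-down, so there is no internal store and no recursion. The names heap_sort_in_place and sift_down are made up for this example.

    def heap_sort_in_place(unsorted):
        """Sorts the list in place: O(n log n) time, O(1) extra space."""
        def sift_down(start, end):
            # Iteratively push the value at start down until neither child is larger
            root = start
            while 2 * root + 1 <= end:
                child = 2 * root + 1
                # Pick the larger of the two children (max-heap order)
                if child + 1 <= end and unsorted[child] < unsorted[child + 1]:
                    child += 1
                if unsorted[root] < unsorted[child]:
                    unsorted[root], unsorted[child] = unsorted[child], unsorted[root]
                    root = child
                else:
                    return

        n = len(unsorted)
        # Heapify the list itself (no separate store)
        for start in range(n // 2 - 1, -1, -1):
            sift_down(start, n - 1)
        # Repeatedly move the current max to the end and shrink the heap
        for end in range(n - 1, 0, -1):
            unsorted[0], unsorted[end] = unsorted[end], unsorted[0]
            sift_down(0, end - 1)
        return unsorted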

"""
pass
heap = MinHeap()
for item in list:
heap.add(item)
for i in range(len(list)):
list[i] = heap.remove()
return list
Comment on lines +11 to +13


Note that since this isn't a fully in-place solution (the MinHeap has an O(n) internal store), we don't necessarily need to modify the passed-in list. The tests are written to check the return value, so we could unpack the heap into a new result list to avoid mutating the input.

Also, in this situation, since we built the heap, we also "know" the number of items in the heap. So it's OK to iterate a fixed number of times. But if we were pulling things out of a heap more generally, we would want to make use of the empty helper as follows:

    result = []
    while not heap.empty():
        result.append(heap.remove())

    return result

51 changes: 38 additions & 13 deletions heaps/min_heap.py
@@ -19,18 +19,29 @@ def __init__(self):

         def add(self, key, value = None):
             """ This method adds a HeapNode instance to the heap
             If value == None the new node's value should be set to key
    -        Time Complexity: ?
    -        Space Complexity: ?
    +        Time Complexity: O(log n)
    +        Space Complexity: O(log n) due to recursive stack calls
Comment on lines +22 to +23


✨ Great. You're exactly right that it's due to the recursive call in heap_up that the space complexity is O(log n). If heap_up were implemented iteratively, this would only require O(1) space complexity since the stack size wouldn't depend on the heap depth.
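
For example, one possible iterative shape for heap_up (a sketch only, reusing the store and swap helpers already in the class) could be:

    def heap_up(self, index):
        # Walk toward the root, swapping while the parent's key is larger.
        # No recursion, so the extra space stays O(1) regardless of heap height.
        while index > 0:
            parent_index = (index - 1) // 2
            if self.store[parent_index].key <= self.store[index].key:
                break
            self.swap(parent_index, index)
            index = parent_index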

"""
pass
if value is None:
value = key
self.store.append(HeapNode(key, value))
self.heap_up(len(self.store) - 1)

         def remove(self):
             """ This method removes and returns an element from the heap
             maintaining the heap structure
    -        Time Complexity: ?
    -        Space Complexity: ?
    +        Time Complexity: O(log n)
    +        Space Complexity: O(log n) due to recursive stack calls
Comment on lines +33 to +34


✨ Nice. Just as for add, you're right that the O(log n) space complexity of remove is due to the recursive heap_down implementation. We could achieve O(1) space complexity if we used an iterative approach.

"""
pass
if self.empty():


✨ Nice use of your own helper method!

    +            return None
    +        last_index = len(self.store) - 1
    +        root_index = 0
    +        self.swap(last_index, root_index)
    +        removed = self.store.pop()
    +        self.heap_down(root_index)
    +        return removed.value




@@ -44,10 +55,10 @@ def __str__(self):

         def empty(self):
             """ This method returns true if the heap is empty
    -        Time complexity: ?
    -        Space complexity: ?
    +        Time complexity: O(1)
    +        Space complexity: O(1)
"""
pass
return len(self.store) == 0

Remember that an empty list is falsy

        return not self.store



         def heap_up(self, index):
@@ -57,18 +68,32 @@ def heap_up(self, index):

             property is reestablished.

             This could be **very** helpful for the add method.
    -        Time complexity: ?
    -        Space complexity: ?
    +        Time complexity: O(1)
    +        Space complexity: O(1)
Comment on lines +71 to +72


👀 This is where the O(log n) time and space complexity in add comes from. Since heap_up calls itself recursively, the worst case will be when the new value needs to be moved all the way up the heap, which will have a height of log n. So both the time and space complexity (due to the stack growth) are O(log n). If we implemented this instead with an iterative approach, the space complexity would be O(1).

"""
pass
parent_index = (index - 1) // 2
if parent_index >= 0 and self.store[parent_index].key > self.store[index].key:
self.swap(parent_index, index)
self.heap_up(parent_index)

def heap_down(self, index):

✨ Nice set of conditionals to narrow in on where to swap. Notice there's a little duplication since we call swap and heap_down in two places. We could try to fully determine which child we're going to swap with first, and then have a single code flow to swap and re-heapify.

Though not prompted, like heap_up, heap_down is also O(log n) in both time and space complexity. The worst case for re-heapifying is if the new root needs to move back down to a leaf, and so the stack growth will be the height of the heap, which is log n. If we implemented this instead with an iterative approach, the space complexity would be O(1).
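
One possible shape for that (a sketch only, combining both suggestions: find the smaller child first, then a single swap, done iteratively so the extra space stays O(1)):

    def heap_down(self, index):
        # Determine which child (if any) has the smaller key, swap once,
        # and continue from that position; the loop replaces the recursion.
        while True:
            left_index = (index * 2) + 1
            right_index = (index * 2) + 2
            smallest = index
            if left_index < len(self.store) and self.store[left_index].key < self.store[smallest].key:
                smallest = left_index
            if right_index < len(self.store) and self.store[right_index].key < self.store[smallest].key:
                smallest = right_index
            if smallest == index:
                break
            self.swap(smallest, index)
            index = smallest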

""" This helper method takes an index and
moves the corresponding element down the heap if it's
larger than either of its children and continues until
the heap property is reestablished.
"""
pass
left_index = (index * 2) + 1
right_index = (index * 2) + 2
if left_index < len(self.store) and right_index < len(self.store):
smallest = left_index
if self.store[left_index].key > self.store[right_index].key:
smallest = right_index
if self.store[smallest].key < self.store[index].key:
self.swap(smallest, index)
self.heap_down(smallest)
elif left_index < len(self.store) and self.store[left_index].key < self.store[index].key:
self.swap(left_index, index)
self.heap_down(left_index)


         def swap(self, index_1, index_2):