Welcome back! Great to see you again!
Last week, we looked at how to implement DFS and BFS using queuing structures. This week, to implement Dijkstra’s algorithm, we’re going to need a queuing structure that’s a bit more elaborate: one that’s going to take the weights in the graph into account.
Let’s take a look at another queuing structure to implement Dijkstra’s algorithm. Say hello to the min-heap.
With a min-heap, two elementary operations can be performed. The first one is add-or-replace, which adds a (key, value) couple to the min-heap. If the key already exists and its associated value is larger than the new one, the value is updated with the new value. If the existing value is smaller, the operation is ignored.
The second elementary operation of the min-heap is remove, which will return the (key, value) couple corresponding to the minimum value in the min-heap.
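To make these semantics concrete, here is a toy sketch of the two operations using a plain Python dict. This is only an illustration of the behavior, not an efficient implementation (remove scans every item), and the function names simply mirror the ones used in the video.

```python
def add_or_replace(heap, key, value):
    # Insert the (key, value) couple; if the key is already present,
    # keep the smaller of the existing and new values.
    if key not in heap or value < heap[key]:
        heap[key] = value

def remove(heap):
    # Return (and delete) the couple with the minimum value.
    key = min(heap, key=heap.get)
    return key, heap.pop(key)

heap = {}
add_or_replace(heap, "a", 5)
add_or_replace(heap, "a", 3)   # replaces: 3 < 5
add_or_replace(heap, "a", 7)   # ignored: 7 > 3
print(remove(heap))            # ('a', 3)
```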
Got it? Now let’s see how we can use a min-heap to implement the Dijkstra algorithm.
Dijkstra using a min-heap
The goal of Dijkstra’s algorithm is to find shortest paths between a starting position and all other vertices in a weighted graph. To implement this algorithm using a min-heap, all we have to do is to add vertices as keys, and distances to the starting position as associated values.
```
def dijkstra(graph, start_vertex):
    # Initialization
    min_heap = new min_heap
    add_or_replace(min_heap, start_vertex, 0)
    # Algorithm loop
    while not(empty(min_heap)):
        v, distance = remove(min_heap)
        for each i neighbor of v:
            distance_through_v = distance + graph[v][i]
            add_or_replace(min_heap, i, distance_through_v)
```
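The pseudocode above can be made runnable with Python’s heapq module. Since heapq offers no add-or-replace, a common workaround is “lazy deletion”: we push duplicate entries and simply skip any vertex whose shortest distance has already been settled. The graph encoding (a dict of dicts of edge weights) is an assumption made for this example.

```python
import heapq

def dijkstra(graph, start_vertex):
    # Maps each settled vertex to its shortest distance from start_vertex.
    distances = {}
    min_heap = [(0, start_vertex)]
    while min_heap:
        distance, v = heapq.heappop(min_heap)
        if v in distances:
            continue  # stale duplicate: v was already settled
        distances[v] = distance
        for i, weight in graph[v].items():
            if i not in distances:
                heapq.heappush(min_heap, (distance + weight, i))
    return distances

graph = {"a": {"b": 1, "c": 4},
         "b": {"a": 1, "c": 2},
         "c": {"a": 4, "b": 2}}
print(dijkstra(graph, "a"))  # {'a': 0, 'b': 1, 'c': 3}
```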
So, that’s it for today, see you for the next video!
Remark (outside of transcripts): In this video, we have used the terms key (the thing we want to store) and value (its importance in the heap). Other terminologies exist! The only important aspect is that items in the min-heap should be ordered according to elements of an ordered set.
Naive implementation of a priority queue using a list
A simple (yet quite inefficient) way to implement a priority queue is to use a list. This list stores pairs (key, value). Here, we show how to create a min-heap using a list.
To create a priority queue using a list, simply create an empty list:
```
def priority_queue():
    return []
```
To test if a priority queue is empty, simply compare it with the empty list:
```
def is_empty(queue):
    return queue == priority_queue()
```
For insertion and recovery, a choice is available to us. We can either do the expensive work at the time of insertion, for example by maintaining our list sorted in ascending order of values:
```
def insert(queue, key, value):
    for i in range(len(queue)):
        key_i, value_i = queue[i]
        if value < value_i:
            return queue[:i] + [(key, value)] + queue[i:]
    return queue + [(key, value)]
```
Or do the expensive work at the time of recovery, by looking for the smallest value then. In that case, insertion simply appends the item to the list:
```
def insert(queue, key, value):
    return queue + [(key, value)]
```
To retrieve the element with the smallest value in this priority queue (and the remaining priority queue), we find ourselves with two versions, depending on the scenario chosen for insertion. For version 1:
```
def extract(queue):
    return queue[0], queue[1:]
```
For version 2:
```
def extract(queue):
    min_i = 0
    min_key, min_value = queue[min_i]
    for i in range(len(queue)):
        key, value = queue[i]
        if value < min_value:
            min_key = key
            min_value = value
            min_i = i
    return (min_key, min_value), queue[:min_i] + queue[min_i+1:]
```
To test our functions, let’s execute the following commands:
```
# Structure preparation
queue = priority_queue()
queue = insert(queue, "five", 5)
queue = insert(queue, "three", 3)
queue = insert(queue, "eight", 8)

# [('three', 3), ('five', 5), ('eight', 8)] if solution 1 is chosen
# [('five', 5), ('three', 3), ('eight', 8)] if solution 2 is chosen
print(queue)

# "three", then "five", then "eight"
while not is_empty(queue):
    (key, value), queue = extract(queue)
    print(key)

# []
print(queue)
```
The problem with this implementation is the complexity of the operations. Depending on the version, either insertion or recovery requires browsing the entire list, leading to a number of elementary operations at least proportional to the size of the list (i.e., O(n), with n the number of items in the list). In practice this complexity is far too great.
Implementation using a balanced binary tree
Let us start with a few definitions. You should already know what a tree is, and what we call its root.
A vertex in a tree is a node if it has neighbors farther from the tree root (which we call children). If not, it is a leaf.
The depth of a tree is the length of the longest shortest path starting from the root.
A tree is balanced if all its shortest paths from the root to a leaf have the same length, up to a difference of one.
The tree model we will use
Instead of using a list, it is proposed to use a balanced binary tree with the following properties:
- Each node of the tree is associated with a couple (key, value). This property makes it possible to link the tree to the elements to be stored/extracted.
- Every parent node has two children, unless the total number of elements in the priority queue is even, in which case a single node has only one child and all the others have two.
Note: Ternary/quaternary/etc. trees could also have been considered (this changes the number of children per node). As the number of children increases, the complexity of the insertion operation decreases, but the complexity of the extraction increases.
- The tree is balanced. This property ensures that the tree does not have a shape similar to a long chain, by making a kind of list, thus losing all the interest of using trees.
- A parent’s value is always smaller than those of its children. This essential property ensures that the node with minimum value is always at the root.
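In practice, a balanced binary tree with these properties is often stored in a flat list rather than with explicit node objects (this encoding is an assumption of this note, not something imposed by the video). With 0-based indexing, the node at index i has its children at indices 2*i + 1 and 2*i + 2, and its parent at (i - 1) // 2:

```python
# Flat-list encoding of a balanced binary tree (0-based indexing).
def children(i):
    return 2 * i + 1, 2 * i + 2

def parent(i):
    return (i - 1) // 2

# Example heap: 1 at the root, children 3 and 2, grandchildren 5 and 4.
# Every parent's value is smaller than those of its children.
heap = [("a", 1), ("b", 3), ("c", 2), ("d", 5), ("e", 4)]
left, right = children(0)
print(heap[left], heap[right])   # ('b', 3) ('c', 2)
print(parent(4))                 # 1  (node ('e', 4) is a child of ('b', 3))
```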
To initialize such a priority queue, simply return an empty tree.
To test if a priority queue is empty, check if it is equal to an empty tree.
To insert a couple (key, value) into the tree, proceed as follows:
- The couple (key, value) is added at the end of the tree, i.e.:
- If all parent nodes have two children, the new node is attached to a leaf of minimum depth, which thus becomes a parent.
- If a parent node has only one child, the new node is added as the second child of that node.
- If the value of this new node is smaller than that of its parent, the couples (key, value) of the node and its parent are exchanged.
- We continue by going back up the tree as much as necessary.
Since we are only replacing elements in the tree with ones of smaller values, the resulting tree keeps the stated properties if the original one had them.
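The insertion procedure above can be sketched on the flat-list encoding of the tree (children of index i at 2*i + 1 and 2*i + 2, parent at (i - 1) // 2); the encoding is an assumption of this note. Appending to the list places the new node at the end of the tree, and the loop goes back up as long as the new value is smaller than its parent’s:

```python
def insert(heap, key, value):
    heap.append((key, value))           # add at the end of the tree
    i = len(heap) - 1
    while i > 0:                        # go back up as much as necessary
        p = (i - 1) // 2
        if heap[i][1] < heap[p][1]:     # smaller than the parent: exchange
            heap[i], heap[p] = heap[p], heap[i]
            i = p
        else:
            break

heap = []
for key, value in [("five", 5), ("three", 3), ("eight", 8), ("one", 1)]:
    insert(heap, key, value)
print(heap[0])   # ('one', 1): the minimum value is at the root
```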
To retrieve an element, proceed as follows:
- The couple (key, value) associated with the root of the tree is returned.
- We replace it with a leaf of maximum depth of the tree, which is removed from its former position.
- If the new root has a value larger than one of its children, the couple (key, value) is exchanged with the child with smallest value.
- We continue by going down the tree as much as necessary.
Here again, it is easy to check that the resulting tree respects the stated properties if the starting tree also respects them.
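The extraction procedure can be sketched the same way, again assuming the flat-list encoding (children of index i at 2*i + 1 and 2*i + 2). The last element of the list is a leaf of maximum depth; it replaces the root, then goes down while one of its children holds a smaller value:

```python
def extract(heap):
    root = heap[0]                      # couple returned to the caller
    last = heap.pop()                   # leaf of maximum depth
    if heap:
        heap[0] = last                  # replace the root with it
        i = 0
        while True:                     # go down as much as necessary
            left, right = 2 * i + 1, 2 * i + 2
            smallest = i
            if left < len(heap) and heap[left][1] < heap[smallest][1]:
                smallest = left
            if right < len(heap) and heap[right][1] < heap[smallest][1]:
                smallest = right
            if smallest == i:
                break
            # exchange with the child of smallest value
            heap[i], heap[smallest] = heap[smallest], heap[i]
            i = smallest
    return root

heap = [("one", 1), ("three", 3), ("eight", 8), ("five", 5)]
print(extract(heap))   # ('one', 1)
print(heap[0])         # ('three', 3)
```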
Insertion and extraction both require, in the worst case, going from the root to a leaf of maximum depth. The tree being binary, this represents O(log n) operations, where n is the number of elements in the queue. The complexity is therefore much lower than with a naive implementation using a list (where the number of operations was at least linear in n).
To go further
- A more complete list of priority structures.
- Priority structures help develop heuristics for visiting neighbors.
- Python’s heapq module: in practice, you should use this for PyRat.
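As a quick taste of that last pointer: heapq works on a plain list and orders tuples by their first component, so couples are typically stored as (value, key). Note that it provides no add-or-replace, which is why workarounds such as lazy deletion are used with Dijkstra’s algorithm.

```python
import heapq

queue = []
heapq.heappush(queue, (5, "five"))
heapq.heappush(queue, (3, "three"))
heapq.heappush(queue, (8, "eight"))
while queue:
    value, key = heapq.heappop(queue)
    print(key)   # "three", then "five", then "eight"
```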