It’s me again, welcome back! I can see you’re hungry for some more concepts on algorithms!
In this lesson, we’re going to take a look at the so-called “complexity of algorithms”.
An introduction to complexity of algorithms
However, to fully understand the concept of complexity, we need to define two things more precisely. Ready?
Sometimes, adding or multiplying numbers can be very costly in terms of time. This is true, for instance, when adding or multiplying very large integers, such as those used in certain cryptographic algorithms. In other words, what counts as an elementary operation depends on the task.
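To make this concrete, here is a minimal Python sketch (the function name is ours) that counts the single-digit multiplications performed by schoolbook multiplication. For large numbers, one “multiplication” is really many elementary operations:

```python
def schoolbook_mult_cost(a: int, b: int) -> int:
    """Number of single-digit multiplications performed by schoolbook
    multiplication: one per pair of digits of the operands."""
    return len(str(a)) * len(str(b))

# Two 4-digit numbers cost 16 digit multiplications...
print(schoolbook_mult_cost(1234, 5678))       # 16
# ...but two 100-digit numbers, as in cryptography, cost 10,000.
print(schoolbook_mult_cost(10**99, 10**99))   # 10000
```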
The second thing we should understand is that the size of the input depends on the context of the algorithm.
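For instance, in Python (a sketch; the variable names are ours), the natural input size for a sorting algorithm is the number of elements, while for an arithmetic algorithm it is the number of bits of the number:

```python
# "Size" for a sorting algorithm: the number of elements in the list.
items = [3, 1, 4, 1, 5]
print(len(items))              # 5

# "Size" for an arithmetic algorithm: the number of bits of the number.
number = 10**18
print(number.bit_length())     # 60
```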
And why? Because knowing this asymptotic behavior is often sufficient to decide whether an algorithm is fast enough, and more accurate estimates are often unnecessary.
The asymptotic behavior expressed by big-O notation lets us drop constants and low-order terms, because once the problem size gets large enough, those terms no longer matter.
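A quick Python sketch illustrates this, assuming a hypothetical algorithm whose exact cost is 3n² + 5n + 2 operations: the ratio to the leading term 3n² approaches 1 as n grows, which is why we may simply write 3n² + 5n + 2 = O(n²).

```python
def exact_ops(n: int) -> int:
    # Hypothetical exact operation count of some algorithm.
    return 3 * n * n + 5 * n + 2

# The low-order terms 5n + 2 become negligible as n grows:
# the printed ratio tends to 1.
for n in (10, 1_000, 100_000):
    print(n, exact_ops(n) / (3 * n * n))
```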
Note that two algorithms can have the same big-O time complexity, even though one is always faster than the other.
Complexity of the algorithms that we have seen so far
Let’s put this into context with the algorithms we’ve seen so far. Both DFS and BFS examine each vertex of a graph once and, for each vertex, consider all of its neighbors. Each edge is therefore considered a constant number of times, so their time complexity is O(V + E), where V is the number of vertices and E the number of edges.
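As a sketch, here is one possible Python implementation of BFS over an adjacency list (the example graph and names are illustrative); the comments mark where each vertex and each edge is accounted for in the O(V + E) bound:

```python
from collections import deque

def bfs(graph: dict, start) -> list:
    """Breadth-first search over an adjacency-list graph.
    Each vertex is enqueued and dequeued at most once (O(V)), and each
    edge is examined a constant number of times (O(E)): O(V + E) total."""
    visited = {start}
    order = []
    queue = deque([start])
    while queue:
        vertex = queue.popleft()        # each vertex dequeued once
        order.append(vertex)
        for neighbor in graph[vertex]:  # each edge considered here
            if neighbor not in visited:
                visited.add(neighbor)
                queue.append(neighbor)
    return order

graph = {"A": ["B", "C"], "B": ["A", "D"], "C": ["A"], "D": ["B"]}
print(bfs(graph, "A"))   # ['A', 'B', 'C', 'D']
```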
This ends today’s lesson on the complexity of algorithms. As you might have guessed, complexity is an important tool to evaluate the performance of an algorithm before starting to implement it.
For example, imagine that you want to implement an artificial intelligence in a game. Since time complexity is directly linked to computation time, and computation time directly determines the response time of your artificial intelligence, you can see how important this matter is if you wish to beat your opponents.
See you next week, when you’ll learn about a very important problem in computer science: the traveling salesman problem, or TSP. Unfortunately, we’ll also see that the algorithms that solve it are very complex. Luckily, this does not mean that they are complicated to understand.
Remarks on complexity of algorithms
It is tempting to compare algorithms by the asymptotic behavior of their complexities alone, and this is indeed how they are most often compared. Sometimes, however, the devil is in the details.
The complexity of an algorithm is defined as the number of elementary operations required in the worst possible case. In practice, the algorithm can be fast in almost all cases. A famous example is the simplex algorithm, whose worst-case complexity is exponential even though it is fast on almost all inputs encountered in practice.
Let us take two functions f and g of n. We can quite easily have f = O(g) and yet f(n) > g(n) for all n. Similarly, some algorithms have a more favorable asymptotic behavior than others but are much slower in practice. A famous example is primality testing: the deterministic AKS test runs in polynomial time, yet probabilistic tests such as Miller–Rabin remain much faster in practice.
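A tiny Python sketch of the first point, taking f(n) = 2n and g(n) = n: f = O(g) (with constant 2), and yet f(n) > g(n) for every positive n.

```python
def f(n: int) -> int:
    return 2 * n   # f = O(g), since 2n <= 2 * g(n) for all n >= 1

def g(n: int) -> int:
    return n

# f is bounded by a constant multiple of g, yet always strictly larger.
assert all(f(n) > g(n) for n in range(1, 1_000))
```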
The importance of constants
Asymptotic notation ignores constants. In particular, we can write c·n = O(n) for any fixed constant c. However, an algorithm whose complexity is exactly c·n is c times slower than an algorithm whose complexity is exactly n.
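As an illustration (a sketch with an arbitrary constant c = 100), here are two functions that both sum a list in O(n) time, but one performs roughly 100·n elementary additions instead of n:

```python
def sum_once(values):
    # About n elementary additions: cost roughly n.
    total = 0
    for v in values:
        total += v
    return total

def sum_hundred_times(values):
    # Redundantly redoes the work 100 times: about 100 * n additions.
    # Still O(n), yet 100 times slower than sum_once in practice.
    total = 0
    for _ in range(100):
        total = 0
        for v in values:
            total += v
    return total

# Both compute the same result; only the hidden constant differs.
print(sum_once([1, 2, 3]), sum_hundred_times([1, 2, 3]))   # 6 6
```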