## 21. Complexity Ordering

We all know that $$O(N) < O(N^2)$$, but what’s the relation between $$O(\sqrt{N})$$ and $$O(\log_2(N))$$? Let’s determine an ordering for some common asymptotic complexities, and find the points where the smaller ones meet the larger ones.

The smallest complexity class is $$O(1)$$. Unsurprisingly, it’s the time complexity of CPU operations like arithmetic evaluations, jumps, and memory allocations. More interestingly, it’s also the time complexity of some graphic operations; for instance, rotating, scaling, or translating a 3D scene in OpenGL is just multiplying two 4x4 matrices.
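To make that concrete: a 4x4 matrix product is always the same fixed amount of work (64 multiplications and 48 additions), no matter how large the scene being transformed is. A minimal sketch, with a function name of my own choosing:

```python
# A 4x4 matrix multiply is a fixed amount of work -- 4*4*4 = 64
# multiplications -- regardless of input size, which is why composing
# 3D transforms this way is O(1).
def mat4_mul(a, b):
    """Multiply two 4x4 matrices given as nested lists."""
    return [[sum(a[i][k] * b[k][j] for k in range(4)) for j in range(4)]
            for i in range(4)]
```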

The next complexity class is $$O(\log_2(N))$$. It’s usually the time complexity of divide-and-conquer-style algorithms, be they binary searches, or descents and ascents through balanced trees.
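Binary search is the archetype: every comparison halves the remaining range, so at most $$\log_2(N)$$ or so elements are ever inspected. A minimal sketch:

```python
# Binary search halves the candidate range at each step, so it
# inspects at most about log2(N) elements of a sorted list.
def binary_search(items, target):
    """Return an index of target in sorted items, or -1 if absent."""
    lo, hi = 0, len(items) - 1
    while lo <= hi:
        mid = (lo + hi) // 2
        if items[mid] == target:
            return mid
        if items[mid] < target:
            lo = mid + 1
        else:
            hi = mid - 1
    return -1
```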

Next comes $$O(\sqrt{N})$$. This complexity rarely comes up in practice: the only times I’ve encountered it were in this trick for speeding up some algorithms, and in computing the discrete logarithm.
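For the discrete logarithm, the classic $$O(\sqrt{N})$$ method is baby-step giant-step, which trades the $$O(N)$$ brute-force scan for two passes of about $$\sqrt{N}$$ steps each. A sketch, assuming the modulus $$p$$ is prime so that modular inverses can be taken via Fermat’s little theorem:

```python
from math import isqrt

# Baby-step giant-step: solves g^x = h (mod p) in O(sqrt(p)) time and
# space instead of an O(p) brute-force scan. Assumes p is prime.
def discrete_log(g, h, p):
    """Return x with pow(g, x, p) == h, or None if no solution."""
    m = isqrt(p) + 1
    # Baby steps: table mapping g^j -> j for j in [0, m).
    table = {pow(g, j, p): j for j in range(m)}
    # Giant steps: repeatedly multiply h by g^(-m) and look it up.
    giant = pow(g, (p - 2) * m, p)  # g^(-m) by Fermat's little theorem
    gamma = h % p
    for i in range(m):
        if gamma in table:
            return i * m + table[gamma]
        gamma = gamma * giant % p
    return None
```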

Then comes $$O(N)$$, which is usually the cost of iterating through a collection of $$N$$ elements, or of enumerating $$N$$ things. Finding the minimum of $$N$$ things is $$O(N)$$; interestingly, so is sorting them, provided you know something more about the things themselves.
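An example of that last point: when every value is known to be a small non-negative integer below some bound $$k$$, counting sort tallies occurrences instead of comparing elements and runs in $$O(N + k)$$, i.e. linear time for fixed $$k$$. A sketch:

```python
# Counting sort: if every value lies in [0, k) for a small fixed k,
# sorting N items takes O(N + k) -- no comparisons needed.
def counting_sort(values, k):
    """Sort integers in the range [0, k)."""
    counts = [0] * k
    for v in values:
        counts[v] += 1
    out = []
    for v, c in enumerate(counts):
        out.extend([v] * c)
    return out
```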

Next is $$O(N \log_2(N))$$, which is the time complexity of most good sorting algorithms. By this point, complexities tend to be the result of combining other complexities; for instance, Heap Sort does $$O(N)$$ heap operations that each take $$O(\log_2(N))$$, which gives us $$O(N \log_2(N))$$.
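That decomposition can be seen directly with Python’s `heapq` module: build the heap, then perform $$N$$ pops of $$O(\log_2(N))$$ each.

```python
import heapq

# Heap sort: N pops of O(log2(N)) each give O(N log2(N)) overall.
def heap_sort(values):
    heap = list(values)
    heapq.heapify(heap)  # builds the heap in O(N)
    return [heapq.heappop(heap) for _ in range(len(heap))]
```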

Finally, we have $$O(N^2)$$, the minimal complexity of most algorithms dealing with $$N \times N$$ matrices: even just reading one means touching all $$N^2$$ entries.
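Even the cheapest whole-matrix operations hit this floor; adding two $$N \times N$$ matrices is one such example. A sketch:

```python
# Adding two N x N matrices must visit all N^2 entries,
# so O(N^2) is a hard lower bound for whole-matrix operations.
def mat_add(a, b):
    n = len(a)
    return [[a[i][j] + b[i][j] for j in range(n)] for i in range(n)]
```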

Putting it all together, we get:

$$O(1) < O(\log_2(N)) < O(\sqrt{N}) < O(N) < O(N \log_2(N)) < O(N^2)$$

As a final note, a very simple way of determining the order of two complexities is to input the two functions into Wolfram Alpha. For instance, to compare $$O(\sqrt{N})$$ and $$O(\log_2(N))$$, we input `sqrt(x) = log(x)/log(2)`.
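Alternatively, a few lines of Python can locate the crossings numerically: bisecting $$f(x) = \sqrt{x} - \log_2(x)$$ on an interval where it changes sign shows that the two curves meet at exactly $$N = 4$$ and $$N = 16$$ (with $$\log_2(N)$$ briefly ahead in between, and $$\sqrt{N}$$ larger forever after).

```python
from math import sqrt, log2

# Bisection on f(x) = sqrt(x) - log2(x): given an interval [lo, hi]
# where f changes sign, repeatedly halve it to pinpoint the crossing.
def crossover(lo, hi, eps=1e-9):
    f = lambda x: sqrt(x) - log2(x)
    while hi - lo > eps:
        mid = (lo + hi) / 2
        if f(lo) * f(mid) <= 0:
            hi = mid
        else:
            lo = mid
    return (lo + hi) / 2
```

Running `crossover(2, 8)` and `crossover(8, 32)` converges to the two crossing points, 4 and 16.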