I am following a course on Algorithms and Data Structures.
Today, my professor said the complexity of the following algorithm is 2^n.
I waited until the lesson was over, approached him, and told him I believed it was actually an O(n) algorithm. I had done the computation to prove it and wanted to show it to him, but he insisted it was not, without giving me any convincing explanation.
The algorithm is recursive, and its recurrence relation is:
T(n) = 1,        if n = 1
T(n) = 2T(n/2),  otherwise
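As a sanity check, the recurrence can be written directly as a function (a minimal sketch, assuming n is a power of 2 so that n/2 stays an integer; the name T is just mine, mirroring the recurrence):

```python
def T(n):
    """Evaluate the recurrence T(1) = 1, T(n) = 2*T(n/2)."""
    if n == 1:
        return 1
    return 2 * T(n // 2)

# For powers of two, T(n) comes out equal to n exactly:
print([T(2 ** k) for k in range(5)])  # → [1, 2, 4, 8, 16]
```

This already suggests T(n) grows linearly, not exponentially.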
I computed it to be O(n), this way:
Let's expand T(n)
T(n) = 2 * T(n/2) = 2 * [2 * T(n/(2^2))] = 2^2 * T(n/(2^2)) = 2^2 * [2 * T(n/(2^3))] = 2^3 * T(n/(2^3)) = ... = 2^i * T(n/(2^i)).
We stop when the term inside the T is 1, that is:
n/(2^i) = 1 ==> n = 2^i ==> i = log n
After the substitution, we obtain
T(n) = 2^(log n) * T(1) = n * 1 = n = O(n).
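The unrolling above can be mirrored step by step in code (a sketch under the same assumption that n is a power of 2; the function name T_unrolled is hypothetical):

```python
def T_unrolled(n):
    """Unroll T(n) = 2^i * T(n / 2^i) until n / 2^i == 1."""
    factor = 1
    while n > 1:
        factor *= 2   # pull out one factor of 2 per level (2^i)
        n //= 2       # the argument shrinks to n / 2^i
    return factor * 1  # after i = log2(n) steps, T(1) = 1

n = 1024
print(T_unrolled(n))  # → 1024, i.e. 2^(log n) * T(1) = n
```

The loop runs log2(n) times, and the accumulated factor is 2^(log n) = n, matching the closed form.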
Since this algorithm came up in a lesson on Merge Sort, I pointed out that Merge Sort, which is famously O(n log n), has the recurrence 2T(n/2) + Theta(n) (obviously greater than 2T(n/2)), and I asked him why an algorithm with a smaller recurrence ends up with a higher big-O. At this point, it seems counter-intuitive to me. He replied, word for word, "If you think that is counter-intuitive, you have serious problems in your math."
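To make the comparison concrete, the Merge Sort recurrence can be evaluated the same way (a sketch: I take the Theta(n) merge term to be exactly n and the base cost M(1) to be 0, both assumptions for illustration; the name M is mine):

```python
import math

def M(n):
    """Evaluate M(1) = 0, M(n) = 2*M(n/2) + n (n a power of 2)."""
    if n == 1:
        return 0  # assumed: no merge cost at size 1
    return 2 * M(n // 2) + n

for k in (4, 8, 16):
    n = 2 ** k
    # with these assumptions, M(n) equals n * log2(n) exactly
    print(n, M(n), n * math.log2(n))
```

So the recurrence with the extra +n term grows like n log n, strictly faster than the plain 2T(n/2) recurrence's n, which is exactly the point of my question.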
My questions are:
- Is there any fallacy in my demonstration?
- Wouldn't the last situation be counter-intuitive?