
I am following a course on Algorithms and Data Structures.

Today, my professor said the complexity of the following algorithm is O(2^n).

I waited till the lesson was over, approached him, and told him I actually believed it was an O(n) algorithm. I had done the computation to prove it and wanted to show it to him, but he kept saying it was not, without giving me any convincing explanation.

The algorithm is recursive, and its recurrence relation is:

    T(n) = { 1          if n = 1
           { 2T(n/2)    otherwise
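
For concreteness, an algorithm with this recurrence makes two recursive calls on halves of the input and does constant work per call. Here is a sketch of such a procedure (my own illustrative example, not necessarily the one from the lecture):

    # A divide-and-conquer sum: two recursive calls on halves plus O(1)
    # combine work, so its running time satisfies T(n) = 2T(n/2), T(1) = 1
    # (up to constant factors). Illustrative only.
    def total(xs, lo=0, hi=None):
        if hi is None:
            hi = len(xs)
        if hi - lo == 1:
            return xs[lo]                                 # base case: T(1)
        mid = (lo + hi) // 2
        return total(xs, lo, mid) + total(xs, mid, hi)    # 2T(n/2) + O(1)

    # usage: total([3, 1, 4, 1]) == 9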

I computed it to be O(n), this way:

Let's expand T(n):

T(n) = 2T(n/2) = 2[2T(n/2^2)] = 2^2 * T(n/2^2) = 2^2 * [2T(n/2^3)] = 2^3 * T(n/2^3) = ... = 2^i * T(n/2^i).

We stop when the term inside the T is 1, that is:

n/2^i = 1 ==> n = 2^i ==> i = log_2 n

After the substitution, we obtain

T(n) = 2^(log_2 n) * T(1) = n * 1 = O(n).
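
The expansion can also be sanity-checked numerically; evaluating the recurrence directly (a quick sketch of mine, for n a power of two) confirms that T(n) equals n rather than 2^n:

    # Evaluate T(n) = 2T(n/2), T(1) = 1 directly and compare against n.
    def T(n):
        return 1 if n == 1 else 2 * T(n // 2)

    for k in range(11):
        n = 2 ** k
        assert T(n) == n   # linear, not exponential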

Since this algorithm came up in a lesson on Merge Sort, I pointed out that Merge Sort, which is famously O(n log n), has the recurrence T(n) = 2T(n/2) + Theta(n), which is obviously greater than 2T(n/2). I asked him why an algorithm with the smaller recurrence should get the higher big-O, because at this point it was counter-intuitive to me. He replied, word for word: "If you think that is counter-intuitive, you have serious problem in your math."
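
To make the comparison concrete, evaluating both recurrences (my own sketch, taking the Theta(n) term as exactly n) shows the gap:

    # T1(n) = 2T1(n/2)     -> solves to n            (this algorithm)
    # T2(n) = 2T2(n/2) + n -> solves to n*log2(n) + n (Merge Sort)
    def T1(n):
        return 1 if n == 1 else 2 * T1(n // 2)

    def T2(n):
        return 1 if n == 1 else 2 * T2(n // 2) + n

    for k in (4, 8, 16):
        n = 2 ** k
        print(n, T1(n), T2(n))   # e.g. n = 256 -> T1 = 256, T2 = 2304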

My questions are:

  1. Is there any fallacy in my demonstration?
  2. Wouldn't the last situation be counter-intuitive?
  • @pluminik: Can I merely suggest that you can't apply algebraic manipulation to the original algorithm (expressed in a pseudo-algebraic form) and expect to get the right answer? The original algorithm is (more or less) a representation of the actual machine instructions required to execute it, not an algebraic equation. – Commented Oct 15, 2015 at 20:28
  • @Robert Harvey: I just read the text of the question.
    – Giorgio
    Commented Oct 15, 2015 at 20:28
  • Either the question is not formulated correctly or pluminik has misunderstood his professor's words.
    – Giorgio
    Commented Oct 15, 2015 at 20:29
  • Perhaps a bit of both. But the first section is clearly a representation of an executable algorithm. – Commented Oct 15, 2015 at 20:30
  • You mean the definition of T? It is a recursive definition, yes, and it is also an algorithm, but recursive definitions are quite common in math.
    – Giorgio
    Commented Oct 15, 2015 at 20:31

3 Answers


First of all, to analyze the complexity of a simple, recursive algorithm, the first stop would be the Master Theorem, where we match your expression T(n) = 2·T(n/2) against the schema T(n) = a·T(n/b) + f(n). By comparing coefficients, we can see that a = 2, b = 2, f(n) = x for some constant x. The Master Theorem then gives us the complexity depending on whether certain relations between the coefficients hold.

Here, the case applies where f(n) = O(n^c) for some c < log_b a: take c = 0, since f(n) = O(1) = O(n^0) and 0 < log_2 2 = 1. Therefore, the Master Theorem tells us that T(n) = Theta(n^(log_b a)) = Theta(n), so T(n) = O(n) also holds. In other words, you are right.
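
The case analysis can be mechanized; here is a small helper (a sketch of mine, not a library function) that applies the three standard cases for T(n) = a·T(n/b) + Theta(n^c):

    import math

    # Decide which Master Theorem case applies to T(n) = a*T(n/b) + Theta(n^c).
    def master_case(a, b, c):
        crit = math.log(a, b)                  # critical exponent log_b(a)
        if c < crit:
            return "Theta(n^%g)" % crit        # case 1: recursion dominates
        if c == crit:
            return "Theta(n^%g * log n)" % c   # case 2: balanced
        return "Theta(n^%g)" % c               # case 3: f(n) dominates

    print(master_case(2, 2, 0))   # this recurrence: Theta(n^1) = Theta(n)
    print(master_case(2, 2, 1))   # Merge Sort:      Theta(n^1 * log n)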

Your proof checks out, but to be really convincing you should use proof by induction. You've essentially used induction but have not clearly labeled each step, which makes it a bit hard to follow (also, you did it backwards: you should start with the base case T(1) and show that the property carries over to larger n, rather than starting with the general case and working toward the base case).
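
For reference, the induction could be written out like this (a sketch of mine, restricted to n a power of two, which is the natural domain of this recurrence):

    Claim: T(2^k) = 2^k for all k >= 0.
    Base case (k = 0): T(1) = 1 = 2^0.
    Inductive step: assume T(2^k) = 2^k. Then
        T(2^(k+1)) = 2 * T(2^k) = 2 * 2^k = 2^(k+1).
    Hence T(n) = n whenever n = 2^k, and so T(n) = O(n).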

  • Why would a (a constant multiplied against the end result) affect the complexity? Isn't that just like adding O(n) to the end result?
    – ebyrob
    Commented Oct 15, 2015 at 20:45
Awesome! I'm still studying the Master Theorem, so I used the proof with which I was more comfortable, but your proof clarified some things even more. I think I'll edit the post with the induction proof you suggested when I return home. Also, I think I'm gonna close this question with your answer, if nothing else happens here.
    – doplumi
    Commented Oct 15, 2015 at 20:50
  • @ebyrob In normal big-O calculus, constants don't change anything. However, a recursive definition of the running time makes this more complicated. You might want to read up on the Master Theorem to get a deeper grasp of the problem. As an intuitive example: T(n) = T(n/2) is obviously smaller than O(n) (in fact, O(log n)) because I halve the problem size at each step – but T(n) = 4·T(n/2) must be larger than O(n) because while I halve the problem size, I also do that work four times, which cancels out the halving.
    – amon
    Commented Oct 15, 2015 at 20:55
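
The intuition in the last comment can be checked numerically. A sketch (my own code; I add a +1 per call so there is a unit of work to count) comparing one, two, and four recursive calls on half the input:

    # T_a(n) = a*T_a(n/2) + 1, T_a(1) = 1: 'a' recursive calls on half
    # the input plus O(1) local work.
    def cost(a, n):
        return 1 if n == 1 else a * cost(a, n // 2) + 1

    for n in (16, 256, 4096):
        print(n, cost(1, n), cost(2, n), cost(4, n))
        # cost(1, n) ~ log n,  cost(2, n) ~ 2n,  cost(4, n) ~ n^2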

"The algorithm is recursive, and it has this complexity: ..."

From the formulation of your question it seems that the function T(n) is not the algorithm to be analyzed but its complexity. In your proof you show that, indeed, the function T behaves asymptotically as an exponential function.

Then the fallacy could be that you are trying to prove the wrong thing. In other words:

  1. You are right that computing T takes linear time (maybe even less; logarithmic time, I guess).
  2. Your professor is right in saying that the original algorithm, whose complexity is expressed by T, is exponential.
  • This seems more like a comment than an answer, which you've already posted below the question anyway. – Commented Oct 15, 2015 at 20:31
  • I expanded on my comment because I think this is the correct answer.
    – Giorgio
    CommentedOct 15, 2015 at 20:33
  • You haven't posted an answer. Not yet, anyway. For this to be an answer you'd have to explain how he went awry. – Commented Oct 15, 2015 at 20:33
  • The notation T(n) usually designates the running time of an algorithm, of which Big-O notation condenses the usually relevant properties that allow us to easily compare algorithms. Calculating the Big-O from a recursively defined running time is a common problem.
    – amon
    Commented Oct 15, 2015 at 20:37

The algorithm will have an execution time of O((log n)^2), because there are log n steps, each doubling a number of at most log n bits. The result, i.e. the value T(n), is O(n): it is easily shown by complete induction that T(n) ≤ n, assuming that n/2 means floor(n/2).
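
The T(n) ≤ n claim is easy to confirm by brute force as well (a quick sketch of mine, using floor division for n/2):

    from functools import lru_cache

    # T(n) = 2*T(floor(n/2)), T(1) = 1; check T(n) <= n for every n,
    # not just powers of two.
    @lru_cache(maxsize=None)
    def T(n):
        return 1 if n == 1 else 2 * T(n // 2)

    assert all(T(n) <= n for n in range(1, 100_000))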

As far as intuition goes, if T(n) = O(2^n), then T(1000) should be around 2^1000. But then T(1000) = 2 * T(500), which should be around 2^501... Slight difference.

Of course n = O(2^n) by the definition of big-O.

If what your professor says is that far off the mark, I suspect the recurrence may not have been posted here correctly.
