
Solving a Recurrence Relation: T(n)=T(n-1)+T(n/2)+n

Solve: T(n)=T(n-1)+T(n/2)+n .

I tried solving this using recursion trees. There are two branches, T(n-1) and T(n/2). The T(n-1) branch goes to a greater depth, so we get O(2^n). Is this idea correct?
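
To get a feel for the growth, the recurrence can be tabulated directly. This is only a sketch: the integer arguments, floor division for n/2, and the base case T(0) = T(1) = 1 are assumptions, since the question gives no base case.

```python
# Tabulate T(n) = T(n-1) + T(n//2) + n bottom-up, assuming T(0) = T(1) = 1.
N = 100
T = [1, 1]
for n in range(2, N + 1):
    T.append(T[n - 1] + T[n // 2] + n)

for n in [10, 20, 40, 80]:
    print(n, T[n], 2 ** n)  # compare against the proposed 2^n bound
```

Even at n = 80, T(n) is nowhere near 2^n, which already hints that an O(2^n) estimate, while valid, is very loose.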

This is a very strange recurrence for a CS class. On the one hand, T(n) = T(n-1) + T(n/2) + n is bigger than T(n) = T(n-1) + n, which is O(n^2).

But from another point of view, the functional equation has an exact solution: T(n) = -2(n + 1). You can easily see that this is an exact solution by substituting it back into the equation: -2(n + 1) = -2n + (-(n + 2)) + n. I am not sure whether this is the only solution.

Here is how I got it: T(n) = T(n-1) + T(n/2) + n. Because you calculate things for very big n, n-1 is almost the same as n, so you can rewrite the equation as T(n) = T(n) + T(n/2) + n, which forces T(n/2) + n = 0, that is, T(n) = -2n, so it is linear. The minus sign was counterintuitive to me, but armed with this solution, I tried T(n) = -2n + a and found a = -2.

I believe you are right. The recurrence relation will always split into two parts, namely T(n-1) and T(n/2). Looking at these two, it is clear that n-1 decreases more slowly than n/2; in other words, the n-1 portion generates more branches of the tree. For a big-O bound it suffices to consider the worst case, which here is that both sides of the tree decrease by 1 (since this is the slower rate and produces the most branches). In all, you would need to split the relation in two a total of n times, hence you are right to say O(2^n).

Your reasoning is correct, but you give away far too much. (For example, it is also correct to say that 2n^3+4=O(2^n), but that's not as informative as 2n^3+4=O(n^3).)

The first thing we want to do is get rid of the inhomogeneous term n . This suggests that we may look for a solution of the form T(n)=an+b . Substituting that in, we find:

an+b = a(n-1)+b + an/2+b + n

which reduces to

0 = (a/2+1)n + (b-a)

implying that a=-2 and b=a=-2 . Therefore, T(n)=-2n-2 is a solution to the equation.
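
A quick mechanical check that this closed form satisfies the functional equation (with n/2 taken as exact real division):

```python
# Verify that T(n) = -2n - 2 satisfies T(n) = T(n-1) + T(n/2) + n exactly.
def T(n):
    return -2 * n - 2

for n in [1.0, 2.0, 7.0, 10.0, 1000.0]:
    assert T(n) == T(n - 1) + T(n / 2) + n
print("T(n) = -2n - 2 satisfies the equation")
```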

We now want to find other solutions by subtracting off the solution we've already found. Let's define U(n)=T(n)+2n+2 . Then the equation becomes

U(n)-2n-2 = U(n-1)-2(n-1)-2 + U(n/2)-2(n/2)-2 + n

which reduces to

U(n) = U(n-1) + U(n/2).

U(n)=0 is an obvious solution to this equation, but how do the non-trivial solutions to this equation behave?

Let's assume that U(n)∈Θ(n^k) for some k>0 , so that U(n)=cn^k+o(n^k) . This makes the equation

cn^k+o(n^k) = c(n-1)^k+o((n-1)^k) + c(n/2)^k+o((n/2)^k)

Now, (n-1)^k=n^k+Θ(n^{k-1}) , so that the above becomes

cn^k+o(n^k) = cn^k+Θ(cn^{k-1})+o(n^k+Θ(n^{k-1})) + cn^k/2^k+o((n/2)^k)

Absorbing the lower order terms and subtracting the common cn^k , we arrive at

o(n^k) = cn^k/2^k

But this is false because the right hand side grows faster than the left. Therefore, U(n-1)+U(n/2) grows faster than U(n) , which means that U(n) must grow faster than our assumed Θ(n^k) . Since this is true for any k , U(n) must grow faster than any polynomial.

A good example of something that grows faster than any polynomial is an exponential function. Consequently, let's assume that U(n)∈Θ(c^n) for some c>1 , so that U(n)=ac^n+o(c^n) . This makes the equation

ac^n+o(c^n) = ac^{n-1}+o(c^{n-1}) + ac^{n/2}+o(c^{n/2})

Rearranging and using some order of growth math, this becomes

c^n = o(c^n)

This is false (again) because the left hand side grows faster than the right. Therefore, U(n) grows faster than U(n-1)+U(n/2) , which means that U(n) must grow slower than our assumed Θ(c^n) . Since this is true for any c>1 , U(n) must grow more slowly than any exponential.
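
Both conclusions can be sanity-checked numerically on an integer surrogate of U (a sketch: floor division and the seed U(0) = U(1) = 1 are assumptions, since the recurrence fixes no base case). If U were Θ(n^k), the ratio ln U(n)/ln n would level off at k; if it were Θ(c^n), ln U(n)/n would level off at ln c.

```python
import math

# Integer surrogate for U(n) = U(n-1) + U(n/2), seeded with U(0) = U(1) = 1
# (the seed is an assumption; any positive seed gives the same growth rate).
N = 20000
U = [1, 1]
for n in range(2, N + 1):
    U.append(U[n - 1] + U[n // 2])

for n in [100, 1000, 10000, 20000]:
    lnU = math.log(U[n])
    # First column keeps rising -> faster than any polynomial;
    # second column keeps falling -> slower than any exponential.
    print(n, round(lnU / math.log(n), 2), round(lnU / n, 5))
```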

This puts us into the realm of quasi-polynomials, where ln U(n) ∈ O(log^c n), and subexponentials, where ln U(n) ∈ O(n^ε). Either of these means that we want to look at L(n) := ln U(n), where the previous paragraphs imply that L(n) ∈ ω(ln n) ∩ o(n). Taking the natural log of our equation, we have

ln U(n) = ln( U(n-1) + U(n/2) ) = ln U(n-1) + ln(1+ U(n/2)/U(n-1))

or

L(n) = L(n-1) + ln( 1 + e^{-L(n-1)+L(n/2)} ) = L(n-1) + e^{-(L(n-1)-L(n/2))} + Θ(e^{-2(L(n-1)-L(n/2))})

So everything comes down to: how fast does L(n-1)-L(n/2) grow? We know that L(n-1)-L(n/2)→∞ , since otherwise L(n)∈Ω(n) . And it's likely that L(n)-L(n/2) will be just as useful, since L(n)-L(n-1)∈o(1) is much smaller than L(n-1)-L(n/2) .
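
The gap L(n) - L(n/2) can at least be probed empirically on an integer surrogate of U (a sketch; floor division and the seed U(0) = U(1) = 1 are assumptions):

```python
import math

# Integer surrogate for U(n) = U(n-1) + U(n/2), seeded U(0) = U(1) = 1.
N = 40000
U = [1, 1]
for n in range(2, N + 1):
    U.append(U[n - 1] + U[n // 2])

# L(n) = ln U(n); print the gap L(n) - L(n/2) at geometrically spaced n.
L = lambda n: math.log(U[n])
for n in [1000, 4000, 16000, 40000]:
    print(n, round(L(n) - L(n // 2), 3))
```

On this surrogate the gap appears to keep growing slowly (roughly logarithmically in n), consistent with L(n-1) - L(n/2) → ∞.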

Unfortunately, this is as far as I'm able to take the problem. I don't see a good way to control how fast L(n)-L(n/2) grows (and I've been staring at this for months). The only thing I can end with is to quote another answer: “a very strange recursion for a CS class”.

I think we can look at it this way:

T(n)=2T(n/2)+n < T(n)=T(n−1)+T(n/2)+n < T(n)=2T(n−1)+n

If we apply the Master Theorem to the lower-bound recurrence (and unroll the upper-bound one directly), then:

Θ(n log n) < Θ(T(n)) < Θ(2^n)
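
The sandwich can be checked on an integer version of the recurrence (a sketch; floor division and the base case T(0) = T(1) = 1 are assumptions):

```python
import math

# T(n) = T(n-1) + T(n//2) + n, computed bottom-up.
N = 2000
T = [1, 1]
for n in range(2, N + 1):
    T.append(T[n - 1] + T[n // 2] + n)

for n in [100, 500, 2000]:
    # T(n)/(n log n) keeps growing, while log2(T(n))/n keeps shrinking:
    # T(n) sits strictly between n log n and 2^n.
    print(n, round(T[n] / (n * math.log(n)), 1), round(math.log2(T[n]) / n, 4))
```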

Remember that T(n) = T(n-1) + T(n/2) + n being (asymptotically) bigger than T(n) = T(n-1) + n only applies to functions that are asymptotically positive. In that case, we have T = Ω(n^2).

Note that T(n) = -2(n + 1) is a solution to the functional equation, but it doesn't interest us, since it is not an asymptotically positive solution, so the O notations have no meaningful application to it.

You can also easily check that T(n) = O(2^n). (Refer to yyFred's solution, if needed.)

If you try the definition of O with functions of the type n^a (lg n)^b, where a (>= 2) and b are positive constants, the Substitution Method shows that these are not possible solutions either.

In fact, the only form that admits a proof by the Substitution Method is exponential. But we know this recurrence doesn't grow as fast as T(n) = 2T(n-1) + n, so if T(n) = O(a^n), we can hope for a < 2.

Assume that T(m) <= c(a^m), for some real, positive constant c, with the hypothesis that this relation is valid for all m < n. Trying to prove it for n, we get:

T(n) <= (1/a+1/a^(n/2))c(a^n) + n

We can get rid of the n easily by strengthening the hypothesis with a lower-order term. What is important here is that:

1/a+1/a^(n/2) <= 1

a^(n/2+1)-a^(n/2)-a >= 0

Changing variables:

a^(N+1)-a^N-a >= 0

We want a bound as tight as possible, so we are searching for the lowest a possible. The inequality found above admits solutions a quite close to 1, but can a get arbitrarily close to 1? The answer is no: let a be of the form a = 1 + 1/N. Substituting this a into the inequality and taking the limit N -> INF:

e-e-1 >= 0

which is absurd, since it says -1 >= 0. Hence the inequality above has some fixed number N* as its maximum solution, which can be found computationally. A quick Python program allowed me to find that a < 1 + 1e-45 (with a little extrapolation), so we can at least be sure that:

T(n) = o((1+1e-45)^n)
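
The Python program mentioned above isn't shown; here is a minimal sketch of the same kind of experiment (the helper name is hypothetical). For a fixed a > 1, f(N) = a^(N+1) - a^N - a = a^N (a-1) - a is increasing in N, so the inequality holds for every N past a first threshold, which we can find by direct search:

```python
import math

def threshold(a):
    # Hypothetical helper: first N with a**(N+1) - a**N - a >= 0.
    N = 1
    while a ** (N + 1) - a ** N - a < 0:
        N += 1
    return N

for a in [1.5, 1.1, 1.01, 1.001]:
    # Closed form for comparison: smallest N with a**N >= a / (a - 1).
    closed = math.ceil(math.log(a / (a - 1)) / math.log(a))
    print(a, threshold(a), closed)
```

The closer a is to 1, the larger the threshold, matching the limit argument above: no a works uniformly down to a fixed small base case, but any fixed a > 1 works once the base case is pushed far enough out, consistent with the subexponential behavior found in the earlier answer.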

T(n)=T(n-1)+T(n/2)+n is essentially the same as T(n)=T(n)+T(n/2)+n when we consider extremely large n, since n-1 is then practically n. That equation can only hold if T(n/2) + n = 0, which gives T(n) = -2n; that is, the solution is linear, O(n) in magnitude.


 