
What is the complexity of multiple runs of an O(n log n) algorithm?

If the problem size is n and an algorithm reduces the problem size by half at every step, I believe the complexity is O(n log n), e.g. merge sort. So, basically you are running an O(log n) algorithm (the comparison) n times...

Now the question is: if I have a problem of size n, my algorithm is able to reduce the size by half in each run, and each run takes O(n log n), what is the complexity in this case?
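
To make the setup concrete, here is a minimal Python sketch of the shape being described. The helpers one_run and solve are hypothetical stand-ins (sorting plays the role of the O(m log m) run), not anything given in the question:

```python
def one_run(data):
    """Placeholder for a single O(m log m) pass, e.g. a sort over the m items."""
    return sorted(data)

def solve(data):
    """Apply the O(m log m) run repeatedly; each run halves the remaining problem."""
    runs = 0
    while data:
        data = one_run(data)[: len(data) // 2]  # keep half, as the question describes
        runs += 1
    return runs  # about log2(n) + 1 runs in total


print(solve(list(range(8))))  # 4 runs: sizes 8 -> 4 -> 2 -> 1
```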

If the problem takes n steps at size n, plus an additional run at size floor(n/2) when n > 1, then it takes O(n) time in total: n + n/2 + n/4 + ... ≈ 2n = O(n).

Similarly, if a run at size n takes O(n log n) time, plus an additional run at size floor(n/2) when n > 1, the total time is O(n log n).
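
A quick numerical check of the geometric-series step, as a sketch (sizes are rounded down, as in the answer above):

```python
def sum_of_sizes(n):
    """n + floor(n/2) + floor(n/4) + ... + 1: the total work in the O(n) case."""
    total, m = 0, n
    while m >= 1:
        total += m
        m //= 2
    return total

for n in (10, 1_000, 1_000_000):
    print(n, sum_of_sizes(n), sum_of_sizes(n) / n)  # the ratio stays below 2
```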

Since the size of the problem gets halved in each iteration and at each level the time taken is n log n, the recurrence relation is

T(n) = T(n/2) + n log n

Applying the Master theorem:

Comparing with T(n) = a T(n/b) + f(n), we have a = 1 and b = 2.

Hence n^(log_b a) = n^(log_2 1) = n^0 = 1.

Thus f(n) = n log n grows polynomially faster than n^(log_b a) = 1, which is case 3 of the Master theorem. The regularity condition also holds, since a f(n/b) = (n/2) log(n/2) ≤ (1/2) n log n = c f(n) with c = 1/2 < 1.

Applying case 3, we get T(n) = Θ(f(n)) = Θ(n log n).

Hence the complexity is T(n) = Θ(n log n).
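
As a cross-check that does not need the Master theorem, the recurrence can be unrolled directly; a sketch of that derivation (the bound reuses the geometric series from the answer above):

```latex
% Unroll T(n) = T(n/2) + n log n down to T(1):
T(n) \;=\; \sum_{k=0}^{\log_2 n} \frac{n}{2^k}\,\log\frac{n}{2^k}
     \;\le\; \log n \cdot \sum_{k=0}^{\infty} \frac{n}{2^k}
     \;=\; 2\,n\log n .
% The k = 0 term alone already gives T(n) >= n log n,
% so T(n) = Theta(n log n), matching the Master theorem result.
```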

EDIT after comments:

If the size of the problem halves at every run, you'll have log(n) runs to complete it. Since every run takes n*log(n) time, the total is log(n) times n*log(n). The total complexity will be:

O(n log(n)^2)

I'm pretty sure that comes to O(n log n). You create a geometric series of n + n/2 + n/4 + ... = 2n (for large n), and you ignore the coefficient and just get the n; multiplied by the log factor, that is O(n log n).

This is fine unless you mean the inner n log n to use the same n value as the outer n on every run.
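
To see the difference between these two readings numerically, here is a hedged sketch comparing the shrinking-n model with the fixed-n model (function names and sizes are mine, for illustration only):

```python
import math

def shrinking(n):
    """Each run costs m * log2(m) on the current size m, which halves every run."""
    total, m = 0.0, n
    while m > 1:
        total += m * math.log2(m)
        m //= 2
    return total

def fixed(n):
    """Each of the log2(n) runs costs n * log2(n) with the original n."""
    return math.log2(n) * n * math.log2(n)

n = 1 << 20
print(shrinking(n) / (n * math.log2(n)))    # about 2   -> Theta(n log n)
print(fixed(n) / (n * math.log2(n) ** 2))   # exactly 1 -> Theta(n log(n)^2)
```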

Edit: I think what the OP means here is that on each run, the inner n in n log n also gets halved. In other words,

n log n + (n/2) log(n/2) + (n/4) log(n/4) + ... + (n/2^(n-1)) log(n/2^(n-1))

If this is the case, then one thing to consider is that at some point

2^(n-1) > n

At that point the log breaks down (because the log of a number between 0 and 1 is negative). But you don't really need the log there, since those iterations each perform only 1 operation. So from there on you are just adding 1s.

This occurs at iteration log n / log 2 (that is, log_2 n). So, for the first log n / log 2 iterations we have the sum as above, and after that it is just a sum of 1s.

n log n + (n/2) log(n/2) + (n/4) log(n/4) + ... + (n/2^(log n / log 2)) log(n/2^(log n / log 2)) + 1 + 1 + 1 + ... + 1   [(n - log n / log 2) times]

Unfortunately, this expression is not an easy one to simplify...
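
Still, a small numeric evaluation suggests the messy expression is Θ(n log n); this is a sketch of that check (the last n - log2(n) iterations are costed at 1 operation each, as described above), not a closed-form simplification:

```python
import math

def messy_sum(n):
    """First log2(n) terms are (n/2^k) * log2(n/2^k); the remaining
    n - log2(n) iterations contribute 1 operation each."""
    runs = int(math.log2(n))
    total = sum((n / 2**k) * math.log2(n / 2**k) for k in range(runs))
    return total + (n - runs)

for n in (2**10, 2**15, 2**20):
    print(n, messy_sum(n) / (n * math.log2(n)))  # ratio tends to 2: Theta(n log n)
```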

If I don't misunderstand the question, the first run completes in time (proportional to) n log n. The second run has only n/2 left, so it completes in (n/2) log(n/2), and so on.

For large n, which is what you assume when analyzing the time complexity, log(n/2) = log n - log 2 can be replaced by log n.

Summing over "all" steps: log(n) * (n + n/2 + n/4 + ...) = 2n log(n), i.e., time complexity O(n log n).

In other words: the time complexity is the same as that of your first/basic step; all the others together "only" contribute the same amount once more.
