
What is the time complexity of an iteration through all possible sequences of an array

An algorithm that goes through all possible sequences of indexes inside an array.

The time complexity of a single loop is linear, and that of two nested loops is quadratic, O(n^2). But what if a third loop is nested inside and goes through all the indexes between the two outer indexes? Does the time complexity rise to cubic, O(n^3)? When N becomes very large, it doesn't seem that there are enough iterations to consider the complexity cubic, yet it seems too big to be quadratic O(n^2).

Here is the algorithm, where N = array length:

for (int i = 0; i < N; i++)
{
    for (int j = i; j < N; j++)
    {
        for (int start = i; start <= j; start++)
        {
            //statement
        }
    }
}

Here is a simple visual of the iterations when N=7 (which goes on until i=7):

[images: iteration tables for N=7]

And so on...

Should we consider the time complexity here quadratic, cubic, or something else entirely?

For the basic

for (int i = 0; i < N; i++) {
    for (int j = i; j < N; j++) {
        // something
    }
}

we execute something n * (n+1) / 2 times => O(n^2). As to why: it is the simplified form of
sum_{x=1}^{n} sum_{y=x}^{n} 1.
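
Spelled out (my own working, in LaTeX notation), the collapse is just the triangular-number sum:

\sum_{x=1}^{n} \sum_{y=x}^{n} 1
  = \sum_{x=1}^{n} (n - x + 1)
  = n + (n - 1) + \dots + 1
  = \frac{n(n+1)}{2}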

For your new case we have a similar formula:
sum_{x=1}^{n} sum_{y=x}^{n} sum_{z=x}^{y} 1. The result is n * (n + 1) * (n + 2) / 6 => O(n^3), so the time complexity is cubic.
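
Collapsing one sum at a time (again my own working, filling in the intermediate steps):

\sum_{x=1}^{n} \sum_{y=x}^{n} \sum_{z=x}^{y} 1
  = \sum_{x=1}^{n} \sum_{y=x}^{n} (y - x + 1)
  = \sum_{x=1}^{n} \frac{(n-x+1)(n-x+2)}{2}
  = \sum_{m=1}^{n} \frac{m(m+1)}{2}
  = \frac{n(n+1)(n+2)}{6}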

The 1 in both formulas is where the cost of something enters; in particular, this is where you would extend the formula further.

Note that all the indices may be off by one; I did not pay particular attention to < vs. <=, etc.
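
If you want to sanity-check both counts (off-by-ones included) empirically, here is a minimal Java sketch of mine, not code from the answer, that counts the loop bodies directly:

public class LoopCount {
    public static void main(String[] args) {
        int n = 7;
        long two = 0, three = 0;                    // body execution counters
        for (int i = 0; i < n; i++) {
            for (int j = i; j < n; j++) {
                two++;                              // double-loop body
                for (int start = i; start <= j; start++) {
                    three++;                        // triple-loop body
                }
            }
        }
        System.out.println(two + " == " + n * (n + 1) / 2);             // 28 == 28
        System.out.println(three + " == " + n * (n + 1) * (n + 2) / 6); // 84 == 84
    }
}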

Short answer: O(choose(N+k, N)), which is the same as O(choose(N+k, k)).


Here is the long answer for how to get there.

You have the basic version of the question correct. With k nested loops, your complexity is going to be O(N^k) as N goes to infinity. However, as k and N both vary, the behavior is more complex.

Let's consider the opposite extreme. Suppose that N is fixed and k varies. If N is 0, the time is constant, because the outermost loop fails on its first iteration. If N = 1, the time is O(k), because you go through all of the levels of nesting with only one choice at each level. If N = 2, something more interesting happens: you go through the nesting over and over again, and it takes time O(k^N). And in general, with fixed N, the time is O(k^N), where one factor of k is due to the time taken to traverse the nesting, and a factor of O(k^(N-1)) comes from the points where your sequence advances. This is an unexpected symmetry!
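
To experiment with k as a variable, the nesting has to become recursion. Here is a minimal sketch, my own illustration with hypothetical names (nest is not from the answer), that visits the same set of index combinations as k nested loops over 0..n-1, each starting at the enclosing loop's index:

public class Nest {
    // Counts visits to the innermost body of k nested loops over 0..n-1,
    // where each inner loop starts at the index of the loop enclosing it.
    static long nest(int level, int k, int from, int n) {
        if (level == k) return 1;                  // innermost body reached once
        long visits = 0;
        for (int i = from; i < n; i++) {
            visits += nest(level + 1, k, i, n);
        }
        return visits;
    }

    public static void main(String[] args) {
        System.out.println(nest(0, 3, 0, 7));      // 84 = 7 * 8 * 9 / 6
        System.out.println(nest(0, 5, 0, 2));      // 6 = k + 1 for N = 2
    }
}

For k = 3 this performs the same number of innermost visits as the question's loops (the ranges are relabeled, but the set of index combinations is the same). For fixed N = 2 the innermost count grows as k + 1, the O(k^(N-1)) factor in the analysis above; each visit costs O(k) to traverse the nesting, giving O(k^N) overall.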

Now what happens if k and N are both big? What is the time complexity of that? Well, here is something to give you intuition.

Can we describe all of the times that we arrive at the innermost loop? Yes! Consider k+N-1 slots, with k of them being "entered one more loop" and N-1 of them being "we advanced the index by 1". I assert the following (a small worked example follows the list):

  1. These correspond 1-1 to the sequences of decisions by which we reach the innermost loop, as can be seen by looking at which indexes are bigger than others, and by how much.
  2. The "entered one more loop" entries at the end is work needed to get to the innermost loop for this iteration that did not lead to any other loop iterations.
  3. If 1 < N, we actually need one more than that in unique work to get to the end.
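
Here is a tiny worked instance of that correspondence (my own, for concreteness): take N = 3 and k = 2, so there are k + N - 1 = 4 slots holding two "E" ("entered one more loop") and two "A" ("we advanced the index by 1"). Reading left to right, the number of A's seen before the m-th E is the value of the m-th index:

EEAA -> (0, 0)    EAEA -> (0, 1)    EAAE -> (0, 2)
AEEA -> (1, 1)    AEAE -> (1, 2)    AAEE -> (2, 2)

That is choose(4, 2) = 6 patterns, exactly the 6 nondecreasing index pairs that two such nested loops visit.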

Now this looks like a mess, but there is a trick that simplifies it quite unexpectedly.

The trick is this. Suppose that we took one of those patterns and inserted one extra "we advanced the index by 1" somewhere in that final stretch of "entered one more loop" entries at the end. How many ways are there to do that? The answer is that we can insert the extra entry between any two spots in that last stretch, including the beginning and the end, and there is one more way to do that than there are entries. In other words, the number of ways to do that matches how much unique work there was in getting to this iteration!

And what that means is that the total work is proportional to O(choose(N+k, N)), which is also O(choose(N+k, k)).
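
As a check on that claim, here is a short Java sketch (again mine, with hypothetical names) that counts every loop iteration at every level of the recursive nesting from above, a reasonable proxy for total work, and compares it against choose(N+k, k); on small cases the count comes out to exactly choose(N+k, k) - 1:

import java.math.BigInteger;

public class TotalWork {
    static long totalIters = 0;                   // iterations summed over all k levels

    static void nest(int level, int k, int from, int n) {
        if (level == k) return;                   // innermost body: the O(1) statement
        for (int i = from; i < n; i++) {
            totalIters++;                         // one unit of loop overhead
            nest(level + 1, k, i, n);
        }
    }

    // Exact binomial coefficient, built up incrementally.
    static BigInteger choose(int n, int r) {
        BigInteger c = BigInteger.ONE;
        for (int i = 0; i < r; i++) {
            c = c.multiply(BigInteger.valueOf(n - i))
                 .divide(BigInteger.valueOf(i + 1));
        }
        return c;
    }

    public static void main(String[] args) {
        int n = 7, k = 3;
        nest(0, k, 0, n);
        System.out.println(totalIters);                                 // 119
        System.out.println(choose(n + k, k).subtract(BigInteger.ONE));  // 119
    }
}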

It is worth knowing that, from the normal approximation to the binomial, if N = k then this turns out to be O(2^(N+k)/sqrt(N+k)), which indeed grows faster than any polynomial. If you need a more general or more precise approximation, you can use Stirling's approximation for the factorials in choose(N+k, N) = (N+k)! / (N! k!).
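
Spelled out for the N = k case (standard asymptotics, not part of the original answer), via the central binomial coefficient and Stirling's formula n! \sim \sqrt{2\pi n}\,(n/e)^n:

\binom{N+k}{N}\bigg|_{k=N} = \binom{2N}{N} \sim \frac{4^{N}}{\sqrt{\pi N}} = \Theta\!\left(\frac{2^{N+k}}{\sqrt{N+k}}\right)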
