
Time complexity of loop with multiple inner loops

for (int i = 0; i < n; ++i) { // n
  for (int j = 0; j < i; ++j) { // n
    cout << i * j << endl;
    cout << "j = " << j;
  }
  for (int k = 0; k < n * 3; ++k) // n?
    cout << "k = " << k;
}

In this loop I see that the first for loop is O(n) and the second loop is also O(n), but the third for loop is confusing to me. Since k is bounded by something that grows with n, would this loop also be O(n)? If so, what does the time complexity of two loops inside another loop come out to in this context? I am assuming O(n^2), since the two n's in the middle are not multiplied together in any way. Is this correct? Also, if I'm correct and the second loop were O(log n) instead of O(n), what would the overall time complexity be?

(Not homework, simply for understanding purposes)

A good rule of thumb for big-O notation is the following:

When in doubt, work inside-out!

Here, let's start by analyzing the two inner loops and then work outward to get the overall time complexity. The two inner loops are shown here:

for (int j = 0; j < i; ++j) {
  cout << i * j << endl;
  cout << "j = " << j;
}
for (int k = 0; k < n * 3; ++k)
  cout << "k = " << k;

The first loop runs O(i) times and does O(1) work per iteration, so it does O(i) total work. That second loop runs O(n) times (it runs 3n times, and since big-O notation munches up constants, that's O(n) total times) and does O(1) work per iteration, so it does O(n) total work. This means that your overall loop can be rewritten as

for (int i = 0; i < n; ++i) {
    do O(i) work; 
    do O(n) work;
}

If you do O(i) work and then do O(n) work, the total work done is O(i + n), so we can rewrite this even further as

for (int i = 0; i < n; ++i) {
    do O(i + n) work; 
}

If we look at the loop bounds here, we can see that i ranges from 0 up to n-1, so i is never greater than n. As a result, the O(i + n) term is equivalent to an O(n) term, since i + n = O(n). This makes our overall loop

for (int i = 0; i < n; ++i) {
    do O(n) work; 
}

From here, it should be a bit clearer that the overall runtime is O(n^2): we do O(n) iterations, each of which does O(n) total work.
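If you want to double-check this result empirically, here is a minimal sketch (the values of n are arbitrary) that counts the inner-loop iterations directly. The exact count is n(n-1)/2 + 3n^2, so the ratio count / n^2 should settle near the constant 3.5, which is exactly what O(n^2) growth looks like:

#include <iostream>
using namespace std;

int main() {
    for (int n : {100, 200, 400, 800}) {
        long long count = 0;
        for (int i = 0; i < n; ++i) {
            for (int j = 0; j < i; ++j) ++count;      // the O(i) loop
            for (int k = 0; k < n * 3; ++k) ++count;  // the O(n) loop
        }
        // count = n(n-1)/2 + 3n^2, so count / n^2 approaches 3.5
        cout << "n = " << n << ": count / n^2 = "
             << double(count) / n / n << endl;
    }
}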


You asked in a comment on another answer what would happen if the second of the nested loops ran only O(log n) times instead of O(n) times. That's a great exercise, so let's see what happens if we try that out!

Imagine the code looked like this:

for (int i = 0; i < n; ++i) {
  for (int j = 0; j < i; ++j) {
    cout << i * j << endl;
    cout << "j = " << j;
  }
  for (int k = 1; k < n; k *= 2) // k starts at 1: doubling 0 would never terminate
    cout << "k = " << k;
}

Here, the second loop runs only O(log n) times because k grows geometrically. Let's again apply the idea of working from the inside out. The inside now consists of these two loops:

  for (int j = 0; j < i; ++j) {
    cout << i * j << endl;
    cout << "j = " << j;
  }
  for (int k = 1; k < n; k *= 2)
    cout << "k = " << k;

Here, that first loop runs in time O(i) (as before) and the new loop runs in time O(log n), so the total work done per iteration is O(i + log n). If we rewrite our original loops using this, we get something like this:

for (int i = 0; i < n; ++i) {  
  do O(i + log n) work;    
}

This one is a bit trickier to analyze, because i changes from one iteration of the loop to the next. In this case, it often helps to approach the analysis not by multiplying the work done per iteration by the number of iterations, but rather by just adding up the work done across the loop iterations. If we do this here, we'll see that the work done is proportional to

(0 + log n) + (1 + log n) + (2 + log n) + ... + (n-1 + log n).

If we regroup these terms, we get

(0 + 1 + 2 + ... + n - 1) + (log n + log n + ... + log n) (n times)

That simplifies to

(0 + 1 + 2 + ... + n - 1) + n log n

That first part of the summation is Gauss's famous sum 0 + 1 + 2 + ... + (n - 1), which is equal to n(n-1) / 2. (It's good to know this! One way to see it: pairing 0 with n-1, 1 with n-2, and so on gives n/2 pairs that each sum to n-1.) This means we can rewrite the total work done as

n(n - 1) / 2 + n log n

= O(n^2) + O(n log n)

= O(n^2)

with that last step following because O(n log n) is dominated by the O(n^2) term.
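As before, we can sanity-check the arithmetic with a quick sketch (again with arbitrary values of n) that counts the iterations of the modified code and compares them against the closed form n(n-1)/2 + n * ceil(log2 n):

#include <cmath>
#include <iostream>
using namespace std;

int main() {
    for (int n : {100, 1000, 10000}) {
        long long count = 0;
        for (int i = 0; i < n; ++i) {
            for (int j = 0; j < i; ++j) ++count;     // O(i) work
            for (int k = 1; k < n; k *= 2) ++count;  // O(log n) work
        }
        // the doubling loop runs ceil(log2 n) times for n > 1
        long long predicted = (long long)n * (n - 1) / 2
                            + (long long)n * (long long)ceil(log2(n));
        cout << "n = " << n << ": counted " << count
             << ", predicted " << predicted << endl;
    }
}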

Hopefully this shows you both where the result comes from and how to come up with it. Work from the inside out, figuring out how much work each loop does and replacing it with a simpler "do O(X) work" statement to make things easier to follow. When the amount of work changes as a loop counter changes, sometimes it's easiest to bound the value and show that it never leaves some range, and other times it's easiest to explicitly add up the work done from one loop iteration to the next.

O(n^2); calculate the area of a triangle.

We get 1+2+3+4+5+...+n, which is the nth triangular number. If you graph it, it is basically a triangle of height and width n.

A triangle with base n and height n has area 1/2 n^2. Big-O doesn't care about constants like 1/2.
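A quick numeric check of that picture (the values of n are arbitrary): summing 1 + 2 + ... + n and dividing by n^2 should approach the triangle-area constant 1/2:

#include <iostream>
using namespace std;

int main() {
    for (long long n : {10, 1000, 100000}) {
        long long sum = 0;
        for (long long i = 1; i <= n; ++i) sum += i;  // the nth triangular number
        // sum = n(n+1)/2, so sum / n^2 approaches 1/2
        cout << "n = " << n << ": sum / n^2 = "
             << double(sum) / n / n << endl;
    }
}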

When you have multiple loops in sequence, the time complexity of the sequence is the worst complexity of any of them. Since both of the inner loops are at most O(n), the sequence as a whole is O(n).

So since you have O(n) code inside an O(n) loop, the total complexity of everything is O(n^2).
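To make both rules concrete, here is a minimal sketch (with an arbitrary n) showing that two O(n) loops in sequence do about 2n total iterations, which is still O(n), while nesting that same sequence inside an O(n) loop gives about 2n^2 iterations, i.e. O(n^2):

#include <iostream>
using namespace std;

int main() {
    int n = 1000;
    long long sequential = 0, nested = 0;

    // Two O(n) loops in sequence: about n + n = 2n iterations -> O(n)
    for (int a = 0; a < n; ++a) ++sequential;
    for (int b = 0; b < n; ++b) ++sequential;

    // The same O(n) sequence nested inside an O(n) loop -> O(n^2)
    for (int i = 0; i < n; ++i) {
        for (int a = 0; a < n; ++a) ++nested;
        for (int b = 0; b < n; ++b) ++nested;
    }

    cout << "sequential: " << sequential << " (about 2n)" << endl;
    cout << "nested: " << nested << " (about 2n^2)" << endl;
}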
