
Big-O runtime of while-loop with nested if-statements

I am trying to determine the Big-O notation of these code-snippets:

#1:

public static void printProducts (int n) {
    int a = 0; // O(1)
    int b = n; // O(1)

    // O(n)?
    while (a < b){
        // O(?) This if is checked n times, but how many times is it run?
        if (a * b == n) { 
            System.out.println( a + "*" + b + "=" + a*b ); // O(1)
            a++;                                           // O(1)
            b--;                                           // O(1)
        }
        else if ( a * b > n ) {
            b--;                                           // O(1)
        }
        else if ( a * b < n ) {
            a++;                                           // O(1)
        }
    }
}

#2:

public static void printProducts2 (int n) {
        int a = 1; // O(1)
        int b = n; // O(1)
    
        // O(log n)
        while (a < b){
            if (a * b == n) {
                System.out.println( a + "*" + b + "=" + a*b ); // O(1)
                a++;                                           // O(1)
                b--;                                           // O(1)
            }
            else { 
                if ( a * b > n ) {
                    b = n/a;                                   // O(log n)
                }
                else if ( a * b < n ) {
                    a++;                                       // O(1)
                }
            }
        }
    }

I have concluded that the Big-O notation of the first snippet is O(n), and O(log n) for the second, but I'm uncertain whether that's correct. Am I on the right track here?

I looked at this related question before asking my own, but I couldn't quite understand how it applies here.

O(n) and O(sqrt(n)) respectively.

The first one is indeed O(n). You rightly say that the while loop runs O(n) times, and everything within the loop is constant time*, so it does not matter how often the if-conditions are true.
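You can see this bound empirically. The sketch below (a hypothetical instrumented copy of the first snippet, not code from the question) counts loop iterations instead of printing: each pass increments a or decrements b (or both), so the gap b - a shrinks by at least 1 per iteration, giving at most n iterations.

```java
public class LoopCount1 {
    // Same control flow as printProducts, but returns the iteration count.
    static long countIterations(int n) {
        int a = 0, b = n;
        long iterations = 0;
        while (a < b) {
            iterations++;
            if (a * b == n) {        // product found: move both pointers
                a++;
                b--;
            } else if (a * b > n) {  // overshot: shrink b
                b--;
            } else {                 // undershot: grow a
                a++;
            }
        }
        return iterations;
    }

    public static void main(String[] args) {
        for (int n : new int[]{10, 100, 10_000}) {
            // The gap b - a starts at n and shrinks by >= 1 each pass,
            // so the count is bounded by n.
            System.out.println(n + " -> " + countIterations(n) + " iterations");
        }
    }
}
```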

The second one is more interesting. You're correct to point out that the faster decrease of b makes the function less complex. In fact, what this function does is increase a stepwise and then set b to the appropriate value such that a*b == n, if such a b exists. Crucially, after the branch b = n/a runs, we have a*b <= n, so the very next iteration must increase a. This means b is adjusted at most once per increase of a, and so the loop body runs at most a constant number of times for each increase in a.

Now we only need to figure out how often a gets increased. The loop stops when a >= b. Because b is at most n/a after the first adjustment, this means the loop stops once a grows past the square root of n. Therefore, the function is in O(sqrt(n)).
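The sqrt(n) bound can also be checked numerically. This sketch (again a hypothetical instrumented copy, assuming a small constant-factor slack of 3*sqrt(n) + 5 for the comparison) counts the second snippet's iterations:

```java
public class LoopCount2 {
    // Same control flow as printProducts2, but returns the iteration count.
    static long countIterations(int n) {
        int a = 1, b = n;
        long iterations = 0;
        while (a < b) {
            iterations++;
            if (a * b == n) {        // product found: move both pointers
                a++;
                b--;
            } else if (a * b > n) {  // overshot: jump b straight to n/a
                b = n / a;
            } else {                 // undershot: grow a
                a++;
            }
        }
        return iterations;
    }

    public static void main(String[] args) {
        for (int n : new int[]{100, 10_000, 1_000_000}) {
            // Iterations should track sqrt(n), not n.
            System.out.println(n + " -> " + countIterations(n)
                    + " iterations (sqrt(n) ~ " + (long) Math.sqrt(n) + ")");
        }
    }
}
```

For n = 1,000,000 the count stays near 2000 rather than a million, consistent with the O(sqrt(n)) analysis.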

* Actually the time it takes to divide, multiply and compare numbers can scale with the size of the numbers, but we'll ignore that for now, as it doesn't seem like that is the focus of the question.
