
Time complexity of the recursive algorithm

Can someone please explain to me how to calculate the complexity of the following recursive code:

/* Computes b^p mod m by repeated squaring. */
long square(long x) { return x * x; }  /* helper the question assumes but does not show */

long bigmod(long b, long p, long m) {
    if (p == 0)
        return 1;                                    /* b^0 = 1 */
    else if (p % 2 == 0)
        return square(bigmod(b, p / 2, m)) % m;      /* even exponent: halve it */
    else
        return ((b % m) * bigmod(b, p - 1, m)) % m;  /* odd exponent: peel off one factor */
}

This is O(log(p)): on each call you either divide p by 2, or you subtract one, which makes p even so it gets divided by 2 on the very next call. The worst case is therefore about 2 * log(p) calls, one for the subtraction and one for the division at each step, and O(2 * log(p)) is still O(log(p)).

Note that in this example the worst case and average case should be the same complexity.

If you want to be more formal about it, you can write a recurrence relation and use the Master theorem to solve it: http://en.wikipedia.org/wiki/Master_theorem
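For this function, one plausible recurrence for the number of calls T(p) looks like this (my formulation; the answer above does not write it out):

    T(0) = 1
    T(p) = T(p / 2) + 1    if p > 0 and p is even
    T(p) = T(p - 1) + 1    if p is odd

Since an odd step is always followed by an even step, T(p) <= 2 * log2(p) + 2, which is O(log p).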

It is O(log(N)) with base 2, because of the division by 2.

It runs in O(log n).

There are no expensive operations inside the function (by that I mean nothing more expensive than squaring or taking a modulus; there is no looping, etc.), so we can pretty much just count the function calls.

The best case is when the exponent is a power of two: every call halves it, so we need about log(n) calls.

In the worst case we hit an odd number on every other call (subtracting 1 from an odd exponent always gives an even one, so two odd steps never occur in a row). That can at most double the number of calls, and multiplication by a constant factor is no worse asymptotically: 2 * f(x) is still O(f(x)).
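To make the call-counting concrete, here is a minimal instrumented sketch (the calls counter, the square helper, and main are my additions for illustration, not part of the original code):

    #include <stdio.h>

    static long calls;  /* how many times bigmod is entered */

    long square(long x) { return x * x; }

    long bigmod(long b, long p, long m) {
        calls++;
        if (p == 0)
            return 1;
        else if (p % 2 == 0)
            return square(bigmod(b, p / 2, m)) % m;
        else
            return ((b % m) * bigmod(b, p - 1, m)) % m;
    }

    int main(void) {
        calls = 0;
        bigmod(3, 1024, 1000);  /* p = 2^10: best case, halving on every call */
        printf("p = 1024 -> %ld calls\n", calls);  /* prints 12 */

        calls = 0;
        bigmod(3, 1023, 1000);  /* p = 2^10 - 1, all ones in binary: worst case */
        printf("p = 1023 -> %ld calls\n", calls);  /* prints 20 */
        return 0;
    }

The worst-case exponent takes roughly twice as many calls as the best-case one of the same size, matching the constant-factor argument above.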

O(log n)
