
Time complexity of a simple algorithm

Hello, and sorry for my bad English.

I'm still trying to estimate the time complexity of the following algorithm.

Here it is:

#include <stdio.h>

int main(void) {
    int f = 1, n, x, licznik = 0;   /* licznik counts the multiplications */

    printf("Value of n: ");
    scanf("%d", &n);
    printf("Value of x: ");
    scanf("%d", &x);

    /* computes x^n by repeated squaring; the result accumulates in f */
    while (n > 0) {
        if (n % 2 == 0) {
            x = x * x;       /* n even: square the base */
            n = n / 2;
            licznik++;
        }
        else {
            f = f * x;       /* n odd: multiply the base into the result */
            n = n - 1;
            licznik++;
        }
    }

    printf("Result: %d, multiplications: %d\n", f, licznik);
    return 0;
}

My observation:

    n          licznik
    0          0
    10         5
    100        9
    1000       15
    10000      18
    1000000    26

So licznik keeps growing, but very slowly, so it looks like a "log n" function. Is it correct to say that the time complexity of this algorithm is O(log n)? What about the best case and the worst case? Thanks for your help.

PS: "licznik" is a number of multiplications.

If n is an odd number, you will always decrement it by one, making it even and guaranteeing that it will be halved in the next iteration.

In the worst case you get about 2*log n multiplications, which is O(log n) complexity.
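
If you want to sanity-check that bound, here is a small harness (my own sketch, not part of the original code; the helper name count_mults and the limit of 1000000 are arbitrary choices for the test) that replays the loop for every n up to the limit and checks the count against 2*floor(log2 n) + 1:

#include <stdio.h>

/* Count the multiplications the original loop performs for a given n.
   (x and f do not affect the count, so they are omitted here.) */
static int count_mults(int n) {
    int licznik = 0;
    while (n > 0) {
        if (n % 2 == 0)
            n = n / 2;      /* squaring step */
        else
            n = n - 1;      /* multiplication step */
        licznik++;
    }
    return licznik;
}

int main(void) {
    const int LIMIT = 1000000;   /* arbitrary test range */
    int worst = 0, worst_n = 0;

    for (int n = 1; n <= LIMIT; n++) {
        int licznik = count_mults(n);

        /* floor(log2 n), computed by repeated halving */
        int log2n = 0;
        for (int m = n; m > 1; m /= 2)
            log2n++;

        /* the claimed ~2*log2(n) worst case, with +1 slack for the final step */
        if (licznik > 2 * log2n + 1)
            printf("bound violated at n = %d\n", n);

        if (licznik > worst) {
            worst = licznik;
            worst_n = n;
        }
    }
    printf("worst count up to %d: %d (at n = %d)\n", LIMIT, worst, worst_n);
    return 0;
}

No violations should be printed; the largest counts come from values of n whose binary representation has many 1-bits.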

OK, let's take a wider look.

In your algorithm, the value is halved in one iteration and decremented by 1 in the next. Since the decrement is negligible compared to the halving (for very large values), we can say that the value is (approximately) halved once every 2 iterations. The time complexity then becomes O(2 log n), which is really just O(log n).
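
To put a precise number on it, count directly from the loop: every 1-bit of n costs one multiplication into f, and every halving costs one squaring of x, so for n >= 1

    licznik = floor(log2 n) + s(n),   where s(n) is the number of 1-bits in n.

This matches the table in the question, e.g. n = 1000000 has 19 halvings and 7 one-bits, hence 26. The worst case is n = 2^k - 1 (all bits set), giving licznik = 2k - 1, roughly 2*log2 n as in the answer above; the best case is n = 2^k (a single 1-bit), giving licznik = k + 1, roughly log2 n. Either way the count is Theta(log n).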
