
How to calculate exponent using only arithmetic operations in constant time?

I'm trying to find a way to loop through an array of integers of size N, multiply each integer by 128^((N-1) - i), where i is its index, and then add all of the results together.

For example, an array input of [1, 2, 3, 4] would return 1 * (128^3) + 2 * (128^2) + 3 * (128^1) + 4 * (128^0).

My algorithm needs to run in O(N) time, but the exponentiation is expensive: computing 2^3 by repeated multiplication, for example, takes several operations. So I need a way to process each element of the array in O(1) time, using only arithmetic operations (-, +, *, /, %). The most obvious (incorrect) approach I could think of is to multiply each integer by 128 repeatedly, (N-1) - i times, but that does not take constant time. I also considered exponentiation by squaring, but that takes O(log((N-1) - i)) time per element, which is still not constant.
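For reference, the exponentiation-by-squaring approach mentioned above can be sketched as follows (`pow_by_squaring` and `weighted_sum_naive` are illustrative names, not from the question); it uses O(log e) multiplications per exponent e, which is why applying it once per element gives more than O(N) work overall:

```python
def pow_by_squaring(base, exp):
    """Compute base**exp using O(log exp) multiplications."""
    result = 1
    while exp > 0:
        if exp % 2 == 1:   # odd exponent: peel off one factor
            result *= base
        base *= base       # square the base
        exp //= 2          # halve the exponent
    return result

def weighted_sum_naive(arr):
    """One squaring-based power per element: O(N log N) multiplications."""
    n = len(arr)
    return sum(x * pow_by_squaring(128, n - 1 - i) for i, x in enumerate(arr))
```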

128 is 2^7, and multiplying a number by 128^k shifts its binary representation left by 7*k positions.

1 * (128^3) + 2 * (128^2) + 3 * (128^1) + 4 * (128^0)
= 1000000000000000000000 + 1000000000000000 + 110000000 + 100 
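A quick check of this shift equivalence (the shifts are shown only to illustrate the point; the question itself restricts operations to -, +, *, /, %):

```python
# Multiplying by 128**k is the same as shifting left by 7*k bits,
# since 128 == 2**7.
for x in (1, 2, 3, 4):
    for k in range(4):
        assert x * 128 ** k == x << (7 * k)

# The example sum, written out in binary: the sum of the four
# binary terms shown above.
total = 1 * 128**3 + 2 * 128**2 + 3 * 128**1 + 4 * 128**0
print(bin(total))
```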

To answer the title question: it's possible to prove that with only a constant number of those arithmetic operations, you cannot produce numbers as large as 128^(N-1) for sufficiently large N.

To answer the underlying question: you can use the polynomial evaluation method usually attributed to Horner: ((1 * 128 + 2) * 128 + 3) * 128 + 4. Note that unless you're reducing modulo something, manipulating the bignums is still going to cost you Õ(n^2) time.
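As a sketch (Python, since its arbitrary-precision integers make the bignum behavior visible; `horner_eval` is an illustrative name), Horner's rule needs only one multiplication and one addition per element:

```python
def horner_eval(arr, base=128):
    """Evaluate arr[0]*base**(n-1) + ... + arr[n-1]*base**0
    using n-1 multiplications and n-1 additions (Horner's rule)."""
    acc = 0
    for x in arr:
        acc = acc * base + x
    return acc
```

For the example array, `horner_eval([1, 2, 3, 4])` computes exactly ((1 * 128 + 2) * 128 + 3) * 128 + 4.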

If you are indeed working with bignums, there's a more complicated divide-and-conquer method that should be faster, assuming that bignum multiplication runs faster than the schoolbook method. The idea is to split the input in half, evaluate the lower and upper halves separately using recursion, and then combine them. On your example, this looks like

(1 * 128 + 2) * 128^2 + (3 * 128 + 4),

where we compute the term 128^2 (i.e., 128^(n/2)) by repeated squaring. The operation count is still O(n), since we have the recurrence

T(n) = 2 T(n/2) + O(log n),

which falls into Case 1 of the Master Theorem. In practice, the running time will be dominated by the large multiplications, with whatever asymptotic complexity the particular bignum implementation has.
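A minimal sketch of this divide-and-conquer scheme (Python; `dc_eval` is a hypothetical name, and Python's built-in `**` on integers already performs repeated squaring internally):

```python
def dc_eval(arr, base=128):
    """Evaluate arr[0]*base**(n-1) + ... + arr[n-1]*base**0 by
    splitting the array in half, recursing on each half, and
    recombining with one big multiplication."""
    n = len(arr)
    if n == 1:
        return arr[0]
    mid = n // 2
    high = dc_eval(arr[:mid], base)  # high-order coefficients
    low = dc_eval(arr[mid:], base)   # low-order coefficients
    # base ** (n - mid) is computed by repeated squaring
    return high * base ** (n - mid) + low
```

On the example, `dc_eval([1, 2, 3, 4])` evaluates (1 * 128 + 2) * 128^2 + (3 * 128 + 4).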

