
Asymptotic run time complexity of an expression

Can I say that:

log n + log(n-1) + log(n-2) + ... + log(n-k) = Θ(k * log n)?

Formal way to write the above:

Σ (i runs from 0 to k) log(n - i) = Θ(k * log n)?

If the above statement is right, how can I prove it?

If it is wrong, how can I express it (the left side of the equation, of course) as an asymptotic run time function of n and k?

Thanks.

Denote:

LHS = log(n) + log(n-1) + ... + log(n-k)

RHS = k * log n

Note that:

LHS = log(n*(n-1)*...*(n-k)) = log(polynomial in n of degree k+1)

It follows that this is equal to:

(k+1)*log(n*(1 + terms that vanish in the limit))
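
To make "terms that vanish in the limit" explicit, one can factor the product (a sketch, treating k as fixed while n → ∞):

```latex
\[
\log\bigl(n(n-1)\cdots(n-k)\bigr)
  = \log\Bigl(n^{k+1}\prod_{i=0}^{k}\Bigl(1-\frac{i}{n}\Bigr)\Bigr)
  = (k+1)\log n + \sum_{i=0}^{k}\log\Bigl(1-\frac{i}{n}\Bigr)
\]
```

For fixed k, each log(1 - i/n) term tends to 0 as n → ∞, which is exactly the vanishing correction referred to above.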

Dividing by RHS:

(k+1)*log(n*(1 + terms that vanish in the limit)) / (k * log n)

we get in the limit (as n → ∞):

(k+1)/k = 1 + 1/k

So if k is a constant, both sides grow equally fast, and LHS = Θ(RHS).
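
A quick numerical sanity check of this limit (a minimal sketch; the choice k = 5 and the range of n are arbitrary):

```python
import math

# For fixed k and growing n, the ratio LHS / RHS should approach (k + 1) / k.
k = 5
for n in [10**2, 10**4, 10**6, 10**8]:
    lhs = sum(math.log(n - i) for i in range(k + 1))  # log n + log(n-1) + ... + log(n-k)
    rhs = k * math.log(n)
    print(f"n = {n:>9}: LHS/RHS = {lhs / rhs:.6f}  (limit: {(k + 1) / k:.6f})")
```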

Wolfram Alpha seems to agree.

When n is constant, the terms that previously vanished in the limit no longer disappear; instead you get:

(k+1) * (some constant) / (k * (some other constant))

which is:

(1 + 1/k) * (another constant). So again LHS = Θ(RHS).

When proving Θ, you want to prove both O and Ω.

The upper bound is proven easily (the product has k+1 factors, each at most n):
log(n(n-1)...(n-k)) ≤ log(n^(k+1)) = (k+1) log n = O(k log n)

For the lower bound, if k ≥ n/2, then the product contains at least n/2 factors greater than n/2:
log(n(n-1)...(n-k)) ≥ (n/2) log(n/2) = Ω(n log n) = Ω(k log n), using k ≤ n in the last step,

and if k ≤ n/2, every factor is at least n/2 (since n - k ≥ n/2):
log(n(n-1)...(n-k)) ≥ log((n/2)^k) = k log(n/2) = Ω(k log n)
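
Putting the two bounds together, here is a small empirical check (a sketch: the helper name lhs and the witness constants c1 = 1/4 and c2 = 2 are my own ad-hoc choices, loosely derived from the inequalities above, not tight constants from the proof):

```python
import math

def lhs(n: int, k: int) -> float:
    """log n + log(n-1) + ... + log(n-k), computed term by term."""
    return sum(math.log(n - i) for i in range(k + 1))

# Check  c1 * k * log n  <=  LHS  <=  c2 * k * log n  on a grid of (n, k).
# c1 = 1/4 and c2 = 2 follow loosely from the bounds above for n >= 4, 1 <= k < n.
c1, c2 = 0.25, 2.0
for n in [10, 100, 1000, 10000]:
    for k in [1, n // 4, n // 2, 3 * n // 4, n - 1]:
        bound = k * math.log(n)
        assert c1 * bound <= lhs(n, k) <= c2 * bound, (n, k)
print("Θ(k log n) holds with c1 = 1/4, c2 = 2 on the tested grid.")
```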
