
How to prove the time complexity of Quicksort is O(n log n)

I don't understand the proof given in my textbook that the time complexity of Quicksort is O(n log n). Can anyone explain how to prove it?

Typical arguments that Quicksort's average-case running time is O(n log n) go like this: on average, each partition operation divides the input into two roughly equally sized partitions, and each partition operation takes O(n) time. Thus each "level" of the recursive Quicksort has O(n) total cost (summed across all the partitions at that level), and the number of levels is the number of times you can repeatedly halve n, which is O(log n).
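For concreteness, here is a minimal Quicksort sketch in Python (not taken from the question; the random pivot and Lomuto-style partition are just illustrative choices). The comments mark where the O(n) per-level partition work and the O(log n) expected recursion depth in the argument above come from.

import random

def quicksort(a, lo=0, hi=None):
    """Sort list a in place between indices lo and hi (inclusive)."""
    if hi is None:
        hi = len(a) - 1
    if lo >= hi:
        return
    # Partition: a single O(n) pass over the current sub-array.
    p = partition(a, lo, hi)
    # On average the two halves are roughly equal in size, so the
    # recursion is about O(log n) levels deep, each level costing O(n) in total.
    quicksort(a, lo, p - 1)
    quicksort(a, p + 1, hi)

def partition(a, lo, hi):
    """Lomuto partition with a random pivot; returns the pivot's final index."""
    pivot_index = random.randint(lo, hi)
    a[pivot_index], a[hi] = a[hi], a[pivot_index]  # move pivot to the end
    pivot = a[hi]
    i = lo
    for j in range(lo, hi):          # exactly hi - lo comparisons: O(n)
        if a[j] < pivot:
            a[i], a[j] = a[j], a[i]
            i += 1
    a[i], a[hi] = a[hi], a[i]        # put pivot into its final position
    return i

Usage: data = [3, 1, 4, 1, 5]; quicksort(data) leaves data sorted in place.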

You can make the above argument rigorous in various ways, depending on how rigorous you want to be and on the background and mathematical maturity of your audience. A typical way to formalize it is to express the number of comparisons required in the average case of a Quicksort call as a recurrence relation like

T(n) = O(n) + 2 * T(n/2)

which can be proved to be O(n log n) via the Master Theorem or other means.
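One way to see the last step without invoking the Master Theorem (a sketch: write the O(n) term as cn for some constant c, and assume n is a power of 2 to keep the arithmetic clean) is to unroll the recurrence by repeated substitution:

T(n) = cn + 2T(n/2)
     = cn + 2(c(n/2) + 2T(n/4)) = 2cn + 4T(n/4)
     = 3cn + 8T(n/8)
     = ...
     = k*cn + 2^k * T(n/2^k)

After k = log2(n) levels the subproblems have size 1, so T(n) = cn*log2(n) + n*T(1), which is O(n log n). Applying the Master Theorem with a = 2, b = 2 and f(n) = Θ(n) gives the same bound directly.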
