
Is O(1) always better than O(n) in algorithm time complexity?

Just think for a moment: suppose I have an algorithm that takes O(100) ----> O(1) time, and another algorithm for the same problem that takes O(n), but I know that n's maximum value is 50, so I can say its worst case is O(50). In a case like this, is the O(1) algorithm still the best choice, or is the second, O(n) algorithm? And if it's the second one, can we really always say O(1) is better than O(n)?

Surely not; not always. Big O describes only asymptotic behaviour, which is why

O(1) == O(0.001) == O(50) == O(100) == O(C) # where C is any positive constant

The same holds for O(n):

O(n) == O(0.001n) == O(100n) == O(C * n)    # where C is any positive constant
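
To make that concrete, here is a minimal numeric sketch (the helper same_big_o and its threshold are illustrative assumptions, not a formal test): if the ratio of two cost functions settles near a constant as n grows, they belong to the same Big O class, which is exactly why the constants above vanish.

def same_big_o(f, g, ns=(10, 1_000, 100_000, 10_000_000)):
    """Crude check: does f(n)/g(n) stay roughly constant as n grows?"""
    ratios = [f(n) / g(n) for n in ns]
    return max(ratios) / min(ratios) < 1.01

print(same_big_o(lambda n: 100 * n, lambda n: 0.001 * n))  # True:  both O(n)
print(same_big_o(lambda n: 10 * n * n, lambda n: n))       # False: O(n^2) vs O(n)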

Imagine two algorithms with timings

t1 = 1e100 (seconds) = O(1)
t2 = n     (seconds) = O(n)

For infinite n (asymptotic behaviour) the 1st algorithm is better than the 2nd, but for all real-world cases (small n) t2 is preferable. Even better scaling is not enough:

t1 = 1000               (seconds)
t2 = 100 * n            (seconds)
t3 = n + 1e100 * log(n) (seconds)

Algorithm 3 has better scaling (1 vs. 100: n vs. 100 * n), but the 1e100 * log(n) term makes it impossible to finish in real-world cases.
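
Taking those timings at face value, a short sketch simply evaluates the three cost functions at realistic input sizes (the numbers are the hypothetical seconds from above, not measurements):

import math

def t1(n): return 1000                     # constant, but a large constant
def t2(n): return 100 * n                  # linear, factor 100
def t3(n): return n + 1e100 * math.log(n)  # best scaling, absurd constant

for n in (10, 1_000, 1_000_000):
    print(f"n={n}: t1={t1(n)}s  t2={t2(n)}s  t3={t3(n):.2g}s")
# t3 costs ~1e100 seconds at every practical n, so its superior
# asymptotic scaling never gets a chance to pay off.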

So in the general case you should compare the actual functions instead of their Big O classes:

t1 = 100 (seconds)
t2 = n   (seconds)

Here, if n <= 50 then t2 is the better choice (and for n > 1000 it is quite the opposite).
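
In practice that comparison turns into a dispatch on input size; a minimal sketch, where solve_constant and solve_linear are hypothetical placeholders for the two algorithms:

def solve_constant(data):
    """Hypothetical algorithm costing ~100 s regardless of input size."""
    ...

def solve_linear(data):
    """Hypothetical algorithm costing ~len(data) seconds."""
    ...

def solve(data):
    # t1 = 100 and t2 = n cross at n = 100: below that the linear
    # algorithm is cheaper, above it the constant-time one wins
    if len(data) <= 100:
        return solve_linear(data)
    return solve_constant(data)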

Two algorithms:

100n -> O(n)

10n² -> O(n²)

If n < 10, the quadratic-time algorithm is better; if n > 10, the linear-time algorithm is better (at n = 10 the two costs tie at 1000).
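
A quick sketch confirms where the two cost functions cross:

linear    = lambda n: 100 * n       # O(n)   algorithm
quadratic = lambda n: 10 * n ** 2   # O(n^2) algorithm

for n in (5, 10, 20):
    print(n, linear(n), quadratic(n))
# n=5  -> 500 vs 250:   quadratic cheaper
# n=10 -> 1000 vs 1000: the costs tie
# n=20 -> 2000 vs 4000: linear cheaper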

There are also practical use cases. Quicksort implementations (avg: O(n log n)) often switch to insertion sort (avg: O(n²)) when the given data collection is small enough.
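
A minimal sketch of that hybrid pattern (the cutoff of 16 elements is an assumed value; real libraries tune it empirically): quicksort recurses as usual but hands small sub-ranges to insertion sort, whose tiny constant factor wins at that scale.

def insertion_sort(a, lo, hi):
    """O(n^2) worst case, but very fast on tiny ranges [lo, hi]."""
    for i in range(lo + 1, hi + 1):
        key, j = a[i], i - 1
        while j >= lo and a[j] > key:
            a[j + 1] = a[j]
            j -= 1
        a[j + 1] = key

def quicksort(a, lo=0, hi=None):
    if hi is None:
        hi = len(a) - 1
    if hi - lo + 1 <= 16:              # small range: switch algorithms
        insertion_sort(a, lo, hi)
        return
    pivot = a[(lo + hi) // 2]
    i, j = lo, hi
    while i <= j:                      # Hoare-style partition
        while a[i] < pivot: i += 1
        while a[j] > pivot: j -= 1
        if i <= j:
            a[i], a[j] = a[j], a[i]
            i += 1; j -= 1
    quicksort(a, lo, j)
    quicksort(a, i, hi)

import random
data = random.sample(range(100), 40)
quicksort(data)
print(data == sorted(data))  # True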

