
Spotting the difference between an n! and a 2^n algorithm

I've seen a few interesting discussions recently debating whether or not a given ("hard") problem has, at best, a known 2^n or n! solution.

My question is: aside from actually walking through the algorithm and observing the growth rate, is there a heuristic for quickly spotting one versus the other? I.e., are there certain quickly observable properties of an algorithm that make it obviously one or the other?

Related discussions:

There is no algorithm that can determine the complexity of a program [at all]. This is a consequence of the Halting Problem - you cannot determine whether a given algorithm will stop or not. [You cannot even estimate whether it is Theta(infinity) or anything less than that.]

As a rule of thumb, O(n!) algorithms usually invoke a recursive call inside a loop whose range shrinks by one at each level of recursion, while O(2^n) algorithms invoke a recursive call twice in each call.

Note: not all algorithms that invoke a recursive call twice are O(2^n) - quicksort is a good example of an O(n log n) algorithm that also invokes a recursive call twice. The difference is that each of quicksort's two calls operates on roughly half of the input rather than on all but one element, so the recursion depth is O(log n) instead of O(n).
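
To make the rule of thumb concrete, here is a minimal sketch (my own illustration, not part of the original answer) that simply counts the recursive calls made by the three recursion shapes mentioned above:

def two_calls_on_n_minus_1(n):
    # two recursive calls, each on n-1: about 2^n calls (the 2^n shape)
    if n == 0:
        return 1
    return 1 + two_calls_on_n_minus_1(n - 1) + two_calls_on_n_minus_1(n - 1)

def loop_of_calls_on_n_minus_1(n):
    # a loop of n recursive calls, each on n-1: about n! calls (the n! shape)
    if n == 0:
        return 1
    return 1 + sum(loop_of_calls_on_n_minus_1(n - 1) for _ in range(n))

def two_calls_on_half(n):
    # two recursive calls, each on n//2: only about 2n calls; with O(n) work
    # per call this is the O(n log n) quicksort shape
    if n <= 1:
        return 1
    return 1 + two_calls_on_half(n // 2) + two_calls_on_half(n // 2)

for n in range(1, 9):
    print(n, two_calls_on_n_minus_1(n), loop_of_calls_on_n_minus_1(n), two_calls_on_half(n))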

EDIT: For example:
SAT brute-force solution, O(2^n):

SAT(formula,vars,i):
  if i == vars.length:
      return formula.isSatisfied(vars)
  vars[i] = true
  temp = SAT(formula,vars,i+1)  //first recursive call
  if (temp == true) return true
  vars[i] = false
  return SAT(formula,vars,i+1)  //second recursive call
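
For reference, a runnable Python version of the same brute-force search is sketched below. The CNF representation (a formula as a list of clauses, each clause a list of (variable_index, wanted_value) literals) and the helper is_satisfied are my own assumptions, not part of the original answer:

def is_satisfied(formula, assignment):
    # a clause is satisfied if at least one of its literals matches the assignment
    return all(any(assignment[i] == wanted for i, wanted in clause)
               for clause in formula)

def sat(formula, assignment, i):
    if i == len(assignment):
        return is_satisfied(formula, assignment)
    assignment[i] = True
    if sat(formula, assignment, i + 1):      # first recursive call
        return True
    assignment[i] = False
    return sat(formula, assignment, i + 1)   # second recursive call

# example: (x0 or not x1) and (x1 or x2)
formula = [[(0, True), (1, False)], [(1, True), (2, True)]]
print(sat(formula, [False] * 3, 0))          # True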

Find all permutations, O(n!):

permutations(source,sol):
  if (source.length == 0): 
      print sol
      return
  for each e in source: 
      sol.append(e)
      source.remove(e)
      permutations(source,sol) //recursive call in a loop
      source.add(e)
      sol.removeLast()
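
A runnable Python adaptation of the same backtracking scheme is sketched below (my adaptation, not the original code); it iterates over a copy of the remaining elements so that removing from the list inside the loop is safe, and it assumes the elements are distinct:

def permutations(source, sol):
    if not source:
        print(sol)
        return
    for e in list(source):          # iterate over a copy of the shrinking range
        sol.append(e)
        source.remove(e)
        permutations(source, sol)   # recursive call in a loop -> O(n!)
        source.append(e)
        sol.pop()

permutations([1, 2, 3], [])         # prints the 6 permutations of [1, 2, 3]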

As amit mentioned, it is theoretically not possible to check whether an algorithm is O(2^n) or O(n!). However, you can use the following heuristic (a rough code sketch follows the list):

  1. For different values of n, compute the number of steps, F(n), needed to solve the problem
  2. Plot n vs log(F(n))/n
  3. If it looks like a flat line (or levels off to a flat line), then it is O(2^n)
  4. If it looks like a strictly increasing function, then it is super-exponential
  5. If it looks more like an x vs log(x) plot, then it is "probably" O(n!)
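
A minimal sketch of this heuristic (my own code; the step counts F(n) below are stand-ins - in practice you would instrument your own algorithm and count its basic steps):

import math

def steps_exponential(n):
    return 2 ** n               # stand-in F(n) for an O(2^n) algorithm

def steps_factorial(n):
    return math.factorial(n)    # stand-in F(n) for an O(n!) algorithm

print(" n   log(2^n)/n   log(n!)/n")
for n in range(2, 16):
    print(f"{n:2d}   {math.log(steps_exponential(n)) / n:10.4f}   "
          f"{math.log(steps_factorial(n)) / n:9.4f}")
# log(2^n)/n stays flat at log(2) ~ 0.693, while log(n!)/n keeps growing
# roughly like log(n) - 1 (Stirling's approximation), i.e. the x vs log(x) shape.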
