Iteratively compute subtree sizes for all nodes?
I am trying to create an iterative version of:

    def computeSize(id):
        subtreeSize[id] = 1
        for child in children[id]:
            computeSize(child)
            subtreeSize[id] += subtreeSize[child]
"Iterative" means no recursion, because in Python, if your graph is large and has a long linear chain anywhere, you get a RecursionError.
I am trying to use a stack for this (modeling it on the DFS algorithm), but I am stuck on the details:
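A quick way to reproduce that failure (a hypothetical standalone setup mirroring the recursive code above, using a chain far deeper than CPython's default recursion limit of about 1000):

```python
# A 5001-node linear chain: 0 -> 1 -> 2 -> ... -> 5000.
children = {i: [i + 1] for i in range(5000)}
children[5000] = []
subtreeSize = {}

def computeSize(id):
    subtreeSize[id] = 1
    for child in children[id]:
        computeSize(child)
        subtreeSize[id] += subtreeSize[child]

try:
    computeSize(0)
    failed = False
except RecursionError:
    failed = True

print("RecursionError raised:", failed)  # RecursionError raised: True
```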
    def computeSubtreeSizes(self): # self.sizes[nodeID] has size of subtree
        stack = [self.rootID] # e.g. rootID = 1
        visited = set()
        while stack:
            nodeID = stack.pop()
            if nodeID not in visited:
                visited.add(nodeID)
                for nextNodeID in self.nodes[nodeID]:
                    stack.append(nextNodeID)
For example, once I start, I obviously pop the root ID off the stack, but after that I essentially "lose" the ID once the child loop finishes, and I cannot assign its size later.
Do I need a second stack?
Untested - consider this pseudocode. The idea is to work through a stack of nodes, where each node carries a corresponding stack of its immediate children that have not yet been processed. This means each item on the main stack is a tuple - the first item in the tuple is the node, and the second is the list of unprocessed children.
    def computeSubtreeSizes(self):
        stack = [(self.rootID, [])] # e.g. rootID = 1
        visited = self.sizes = {}
        while stack:
            nodeID, subnodes = stack[-1]
            size = visited.get(nodeID)
            if size is None:
                # Haven't seen it before. Set total to 1,
                # and set up the list of subnodes.
                visited[nodeID] = size = 1
                subnodes[:] = self.nodes[nodeID]
            if subnodes:
                # Process all the subnodes one by one
                stack.append((subnodes.pop(), []))
            else:
                # When finished, update the parent
                stack.pop()
                if stack:
                    visited[stack[-1][0]] += size
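Since the above is untested as posted, here is a minimal harness for it - the `Tree` class is a hypothetical wrapper (the `rootID`/`nodes` attribute names are assumptions taken from the code above), run on a small five-node tree:

```python
class Tree:
    def __init__(self, root_id, nodes):
        self.rootID = root_id
        self.nodes = nodes  # maps node ID -> list of child IDs

    def computeSubtreeSizes(self):
        # The stack-of-(node, unprocessed-children) approach from above.
        stack = [(self.rootID, [])]
        visited = self.sizes = {}
        while stack:
            nodeID, subnodes = stack[-1]
            size = visited.get(nodeID)
            if size is None:
                visited[nodeID] = size = 1
                subnodes[:] = self.nodes[nodeID]
            if subnodes:
                stack.append((subnodes.pop(), []))
            else:
                stack.pop()
                if stack:
                    visited[stack[-1][0]] += size

#        1
#       / \
#      2   3
#     / \
#    4   5
tree = Tree(1, {1: [2, 3], 2: [4, 5], 3: [], 4: [], 5: []})
tree.computeSubtreeSizes()
print(sorted(tree.sizes.items()))  # [(1, 5), (2, 3), (3, 1), (4, 1), (5, 1)]
```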
An obvious potential performance enhancement would be to not bother adding already-visited nodes to the main stack. This only helps if repeated subtrees are very common. It is more code (and less readable), but it could look like this:
    def computeSubtreeSizes(self):
        stack = [(self.rootID, [])] # e.g. rootID = 1
        visited = self.sizes = {}
        while stack:
            nodeID, subnodes = stack[-1]
            size = visited.get(nodeID)
            if size is None:
                # Haven't seen it before. Add totals of
                # all previously visited subnodes, and
                # add the others to the list of nodes to
                # be visited.
                size = 1
                for sn in self.nodes[nodeID]:
                    sn_size = visited.get(sn)
                    if sn_size is None:
                        subnodes.append(sn)
                    else:
                        size += sn_size
                visited[nodeID] = size
            if subnodes:
                # Process all the subnodes one by one
                stack.append((subnodes.pop(), []))
            else:
                # When finished, update the parent
                stack.pop()
                if stack:
                    visited[stack[-1][0]] += size
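To verify that this variant really sidesteps the recursion limit, here is a hypothetical harness (the `Tree` class and its attribute names are assumptions mirroring the code above) that runs it on a 10001-node linear chain - exactly the shape that crashes the recursive version:

```python
class Tree:
    def __init__(self, root_id, nodes):
        self.rootID = root_id
        self.nodes = nodes  # maps node ID -> list of child IDs

    def computeSubtreeSizes(self):
        # Optimized variant: fold in sizes of already-visited
        # children instead of pushing them onto the stack.
        stack = [(self.rootID, [])]
        visited = self.sizes = {}
        while stack:
            nodeID, subnodes = stack[-1]
            size = visited.get(nodeID)
            if size is None:
                size = 1
                for sn in self.nodes[nodeID]:
                    sn_size = visited.get(sn)
                    if sn_size is None:
                        subnodes.append(sn)
                    else:
                        size += sn_size
                visited[nodeID] = size
            if subnodes:
                stack.append((subnodes.pop(), []))
            else:
                stack.pop()
                if stack:
                    visited[stack[-1][0]] += size

# Linear chain 0 -> 1 -> ... -> 10000; only the explicit list
# grows here, so no RecursionError.
chain = {i: [i + 1] for i in range(10000)}
chain[10000] = []
tree = Tree(0, chain)
tree.computeSubtreeSizes()
print(tree.sizes[0])  # 10001
```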
Edits (especially by the question author, after testing) are of course welcome.