
NetworkX: how to properly create a dictionary of edge lengths?

Say I have a regular grid network made of 10x10 nodes, which I create like this:

import networkx as nx
import matplotlib.pyplot as plt

N = 10  # nodes per side
G = nx.grid_2d_graph(N, N)

# Relabel the (i, j) grid nodes with integer labels
labels = dict(((i, j), i + (N - 1 - j) * N) for i, j in G.nodes())
nx.relabel_nodes(G, labels, False)

# Map each integer label to a plotting position
inds = labels.keys()
vals = labels.values()
inds = [(N - j - 1, N - i - 1) for i, j in inds]
posk = dict(zip(vals, inds))

nx.draw_networkx(G, pos=posk, with_labels=True, node_size=150,
                 node_color='blue', font_size=10)
plt.axis('off')
plt.title('Grid')
plt.show()

Now say I want to create a dictionary which stores, for each edge, its length. This is the intended outcome:

d={(0,1): 3.4, (0,2): 1.7, ...}

And this is how I try to get to that point:

from math import sqrt

lengths={G.edges(): math.sqrt((x-a)**2 + (y-b)**2) for (x,y),(a,b) in G.edges()}

But there clearly is something wrong, as I get the following error message:

---------------------------------------------------------------------------
TypeError                                 Traceback (most recent call last)
<ipython-input-7-c73c212f0d7f> in <module>()
      2 from math import sqrt
      3 
----> 4 lengths={G.edges(): math.sqrt((x-a)**2 + (y-b)**2) for (x,y),(a,b) in G.edges()}
      5 
      6 

<ipython-input-7-c73c212f0d7f> in <dictcomp>(***failed resolving arguments***)
      2 from math import sqrt
      3 
----> 4 lengths={G.edges(): math.sqrt((x-a)**2 + (y-b)**2) for (x,y),(a,b) in G.edges()}
      5 
      6 

TypeError: 'int' object is not iterable

What am I missing?

There is a lot going wrong in the last line. First and foremost, G.edges() is an iterable over all edges, not a valid dictionary key — you want one key per edge, not the whole edge collection as a single key. Secondly, G.edges() really just yields the edges, i.e. pairs of node labels, not the positions of the nodes — so unpacking each edge as (x,y),(a,b) tries to unpack an integer label into two values, which is what raises the TypeError.
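To see this concretely, here is a minimal sketch (using a smaller 3x3 grid than in the question) showing that after the relabeling step each edge is a pair of integer labels, not a pair of coordinate tuples:

```python
import networkx as nx

# Rebuild a small version of the grid from the question (N = 3 here)
N = 3
G = nx.grid_2d_graph(N, N)
labels = {(i, j): i + (N - 1 - j) * N for i, j in G.nodes()}
nx.relabel_nodes(G, labels, copy=False)

# After relabeling, every edge is a pair of *integer* labels,
# not a pair of (x, y) coordinate tuples, so unpacking an edge
# as (x, y), (a, b) must fail.
edge = next(iter(G.edges()))
print(edge)
```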

This is what you want instead:

import math

lengths = dict()
for source, target in G.edges():
    # look up the coordinates of each endpoint by its node label
    x1, y1 = posk[source]
    x2, y2 = posk[target]
    lengths[(source, target)] = math.sqrt((x2 - x1)**2 + (y2 - y1)**2)
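Equivalently, the loop can be written as a dict comprehension, and if you want the lengths to live on the graph itself you can attach them as an edge attribute with nx.set_edge_attributes. A sketch, rebuilding the question's setup with N = 3 for brevity (posk is the label-to-position dictionary from the question):

```python
import math
import networkx as nx

# Rebuild a small grid with integer labels and positions,
# mirroring the setup from the question (N = 3 for brevity)
N = 3
G = nx.grid_2d_graph(N, N)
labels = {(i, j): i + (N - 1 - j) * N for i, j in G.nodes()}
nx.relabel_nodes(G, labels, copy=False)
posk = {labels[(i, j)]: (N - j - 1, N - i - 1) for (i, j) in labels}

# Dict comprehension equivalent of the explicit loop
lengths = {
    (u, v): math.hypot(posk[u][0] - posk[v][0], posk[u][1] - posk[v][1])
    for u, v in G.edges()
}

# Optionally store the lengths on the graph as an edge attribute
nx.set_edge_attributes(G, lengths, name="length")
```

On a unit grid every edge connects nodes one step apart, so all the computed lengths here are 1.0; the same code works unchanged for arbitrary node positions.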
