Why the error at 'keys'? I'm trying to calculate the minimum degree of a graph (operations on an undirected graph). Let me know what's wrong, please.
graph = dict()

def add_node(v):
    if v not in graph:
        graph[v] = dict()

def add_edge(v1, v2, cost):
    if v1 in graph and v2 in graph:
        graph[v1][v2] = cost
        graph[v2][v1] = cost

def del_edge(v1, v2):
    if v1 in graph and v2 in graph[v1]:
        graph[v1].pop(v2)
        graph[v2].pop(v1)

def del_node(v):
    if v not in graph:
        return
    for v2 in graph[v]:
        graph[v2].pop(v)
    graph.pop(v)

def nodes(self):
    return iter(self._graph)

def neighbour_count(self, n):
    return len(self.graph[n])
n = int(input())
for _ in range(n):
    comands = input()
    oper, v1, v2, peso, *_ = comands.split(" ") + ["", ""]
    if oper == 'IV': add_node(v1)
    elif oper == 'IA': add_edge(v1, v2, float(peso))
    elif oper == 'RV': del_node(v1)
    elif oper == 'RA': del_edge(v1, v2)

keys = [i for i in graph.nodes()]
degrees = [neighbour_count(i) for i in keys]
print(min(degrees))
Example input:
7
IV A
IV B
IA A B
IV C
IA A C
IA C B
RA A B
Expected output:
1
(Vertices A and B only have one edge each, while C has two.)
graph is a dictionary, and dictionaries don't have a nodes method. So the actual mistake is in the definition of keys: [i for i in graph.nodes()].

You've written def nodes(self): return iter(self._graph) as if you were writing methods for a custom class, but there are no custom classes in your code; you're just using an instance of the built-in dict class, which, again, doesn't have a nodes method.
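You can confirm this in isolation (a minimal repro, with a throwaway example dict, not your full program):

```python
# A plain dict has no .nodes() method, so calling it raises AttributeError.
graph = {"A": {"B": 1.0}, "B": {"A": 1.0}}

print(hasattr(graph, "nodes"))  # False

try:
    graph.nodes()
except AttributeError as e:
    print(e)  # 'dict' object has no attribute 'nodes'
```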
You have a couple of good options here: either drop the method-style definitions and work with the dictionary directly (iterating over graph yields its keys, and len(graph[v]) gives the degree of v), or wrap everything in a proper Graph class so that nodes and neighbour_count really are methods on self. Other options include inheriting from dict or implementing a hash table from scratch, neither of which is ideal.
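For instance, here's a minimal sketch of the first option, keeping your function-based design and just iterating the dict directly instead of calling a nonexistent method (the hard-coded commands at the bottom mirror your example input, replacing the input() loop):

```python
graph = {}

def add_node(v):
    # setdefault only inserts if the key is absent.
    graph.setdefault(v, {})

def add_edge(v1, v2, cost):
    if v1 in graph and v2 in graph:
        graph[v1][v2] = cost
        graph[v2][v1] = cost

def del_edge(v1, v2):
    if v1 in graph and v2 in graph[v1]:
        graph[v1].pop(v2)
        graph[v2].pop(v1)

def neighbour_count(v):
    # Degree of v: number of entries in its adjacency dict.
    return len(graph[v])

# Same operations as the example input (IV/IA/RA), hard-coded here:
add_node("A"); add_node("B"); add_node("C")
add_edge("A", "B", 1.0)
add_edge("A", "C", 1.0)
add_edge("C", "B", 1.0)
del_edge("A", "B")

# Iterating a dict yields its keys -- no .nodes() needed.
degrees = [neighbour_count(v) for v in graph]
print(min(degrees))  # 1
```

With the example input this prints 1, matching the expected output: after RA A B, vertices A and B each keep one edge to C, while C has two.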