RuntimeError: maximum recursion depth exceeded with Python 3.2 pickle.dump
I'm getting the above error with the code below; it occurs at the last line. Please excuse the subject matter, I'm just practicing my Python skills. =)
from urllib.request import urlopen
from bs4 import BeautifulSoup
from pprint import pprint
from pickle import dump

moves = dict()
moves0 = set()

url = 'http://www.marriland.com/pokedex/1-bulbasaur'
print(url)

# Open url
with urlopen(url) as usock:
    # Get url data source
    data = usock.read().decode("latin-1")

# Soupify
soup = BeautifulSoup(data)

# Find move tables
for div_class1 in soup.find_all('div', {'class': 'listing-container listing-container-table'}):
    div_class2 = div_class1.find_all('div', {'class': 'listing-header'})
    if len(div_class2) > 1:
        header = div_class2[0].find_all(text=True)[1]
        # Take only moves from Level Up, TM / HM, and Tutor
        if header in ['Level Up', 'TM / HM', 'Tutor']:
            # Get rows
            for row in div_class1.find_all('tbody')[0].find_all('tr'):
                # Get cells
                cells = row.find_all('td')
                # Get move name
                move = cells[1].find_all(text=True)[0]
                # If move is new
                if move not in moves:
                    # Get type
                    typ = cells[2].find_all(text=True)[0]
                    # Get category
                    cat = cells[3].find_all(text=True)[0]
                    # Get power if not Status or Support
                    power = '--'
                    if cat != 'Status or Support':
                        try:
                            # not STAB
                            power = int(cells[4].find_all(text=True)[1].strip(' \t\r\n'))
                        except ValueError:
                            try:
                                # STAB
                                power = int(cells[4].find_all(text=True)[-2])
                            except ValueError:
                                # Moves like Return, Frustration, etc.
                                power = cells[4].find_all(text=True)[-2]
                    # Get accuracy
                    acc = cells[5].find_all(text=True)[0]
                    # Get pp
                    pp = cells[6].find_all(text=True)[0]
                    # Add move to dict
                    moves[move] = {'type': typ,
                                   'cat': cat,
                                   'power': power,
                                   'acc': acc,
                                   'pp': pp}
                # Add move to pokemon's move set
                moves0.add(move)

pprint(moves)
dump(moves, open('pkmn_moves.dump', 'wb'))
I have reduced the code as much as possible in order to reproduce the error. The fault may be simple, but I just can't find it. In the meantime, I made a workaround by setting the recursion limit to 10000.
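For reference, the workaround described above amounts to something like the following sketch (the limit of 10000 is the value from the question; note this only masks the problem, since pickle still recurses through the linked BeautifulSoup node structure and a deep enough tree will overflow anyway):

```python
import sys

# Default recursion limit is usually 1000; raise it before pickling.
# This is a band-aid, not a fix: pickle walks every parent/sibling
# reference in the BeautifulSoup tree, which is what recurses so deeply.
sys.setrecursionlimit(10000)
```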
Just want to contribute an answer for anyone else who may have this issue. Specifically, I was hitting it while caching BeautifulSoup objects from a remote API in a Django session.
The short answer is that pickling BeautifulSoup nodes is not supported. I instead opted to store the original string data in my object and expose an accessor method that parses it on the fly, so that only the original string data is pickled.
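A minimal sketch of that approach (the class name and the use of `html.parser` are illustrative assumptions, not from the original answer): keep only the raw HTML string on the object, parse it lazily in an accessor, and drop the parsed tree from the pickled state via `__getstate__`.

```python
import pickle

class CachedPage:
    """Illustrative sketch: holds raw HTML, parses on demand.

    Only the plain string survives pickling; the BeautifulSoup
    tree is rebuilt lazily after unpickling.
    """

    def __init__(self, html):
        self.html = html    # plain str: safe to pickle
        self._soup = None   # parsed tree: never pickled

    @property
    def soup(self):
        # Parse lazily; bs4 is only needed when the tree is accessed.
        if self._soup is None:
            from bs4 import BeautifulSoup
            self._soup = BeautifulSoup(self.html, 'html.parser')
        return self._soup

    def __getstate__(self):
        # Hand pickle only the string, not the soup.
        return {'html': self.html}

    def __setstate__(self, state):
        self.html = state['html']
        self._soup = None

page = CachedPage('<p>hello</p>')
data = pickle.dumps(page)       # pickles only the string
restored = pickle.loads(data)   # soup is re-parsed on first access
```

The same idea applies to the question's code: convert each extracted `NavigableString` to a plain `str` before storing it in the dict, and the recursion disappears because nothing in the pickled data references the soup.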