How to fix "AttributeError: 'NoneType' object has no attribute 'tbody'" error in Python?
I expected a CSV file to be created in my desktop directory.
import requests
from bs4 import BeautifulSoup
import pandas as pd
url = "https://basketball.realgm.com/ncaa/conferences/Big-12-Conference/3/Kansas/54/nba-players"
# get permission
response = requests.get(url)
# access html files
soup = BeautifulSoup(response.text, 'html.parser')
# creating data frame
columns = ['Player', 'Position', 'Height', 'Weight', 'Draft Year', 'NBA Teams', 'Years', 'Games Played', 'Points Per Game', 'Rebounds Per Game', 'Assists Per Game']
df = pd.DataFrame(columns=columns)
table = soup.find(name='table', attrs={'class': 'tablesaw', 'data-tablesaw-mode': 'swipe', 'id': 'table-6615'}).tbody
trs = table.find('tr')
# rewording html
for tr in trs:
    tds = tr.find_all('td')
    row = [td.text.replace('\n', '') for td in tds]
    df = df.append(pd.Series(row, index=columns), ignore_index=True)
df.to_csv('kansas_player', index=False)
It looks like your soup.find(...) call cannot locate the 'table', and that is probably why you get a None back. Here is my change; you can tailor it to your CSV export needs:
from bs4 import BeautifulSoup
import urllib.request
url = "https://basketball.realgm.com/ncaa/conferences/Big-12-Conference/3/Kansas/54/nba-players"
# get permission
response = urllib.request.urlopen(url)
# access html files
html = response.read()
soup = BeautifulSoup(html, 'html.parser')
table = soup.find("table", {"class": "tablesaw"})
At this point, table holds the full table content.
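Since soup.find returns None when nothing matches, it is worth guarding before chaining attributes like .tbody; a minimal illustration with an inline document (the HTML string here is a made-up stand-in, not the real page):

```python
from bs4 import BeautifulSoup

# A document with no matching table, to show the failure mode safely.
html = "<html><body><p>no table here</p></body></html>"
soup = BeautifulSoup(html, "html.parser")

table = soup.find("table", {"class": "tablesaw"})
if table is None:
    # This check avoids: AttributeError: 'NoneType' object has no attribute 'tbody'
    print("table not found -- check the selector or the fetched HTML")
else:
    print(table.tbody)
```

Checking for None first turns a confusing traceback into a clear diagnostic about the selector or the fetched HTML.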
From there on, you can easily extract the table row information, for example:
for tr in table.findAll('tr'):
    tds = tr.find_all('td')
    row = [td.text.replace('\n', '') for td in tds]
    .....
Now each row will be a list of the cell texts.
Finally, you can write each row into the CSV, with or without pandas; your call.
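Putting the steps together, here is a minimal end-to-end sketch with pandas. It uses a small inline HTML table as a stand-in for the fetched page (the markup, player names, and the trimmed column list are illustrative assumptions, not the real page's content), and builds the DataFrame from a list of rows, since DataFrame.append was removed in pandas 2.0:

```python
import pandas as pd
from bs4 import BeautifulSoup

# Inline stand-in for the fetched page; the real page's markup may differ.
html = """
<table class="tablesaw" id="table-6615">
  <thead><tr><th>Player</th><th>Position</th></tr></thead>
  <tbody>
    <tr><td>Joel Embiid</td><td>C</td></tr>
    <tr><td>Andrew Wiggins</td><td>F</td></tr>
  </tbody>
</table>
"""
soup = BeautifulSoup(html, "html.parser")
table = soup.find("table", {"class": "tablesaw"})

columns = ["Player", "Position"]  # trimmed for the example
rows = []
# Iterate tbody rows only, so the header <th> row is skipped.
for tr in table.tbody.find_all("tr"):
    tds = tr.find_all("td")
    rows.append([td.text.strip() for td in tds])

# Build the frame once from the collected rows instead of appending
# row by row (DataFrame.append no longer exists in pandas >= 2.0).
df = pd.DataFrame(rows, columns=columns)
df.to_csv("kansas_players.csv", index=False)
```

Note the explicit .csv extension on the output filename; without it the file is still written, but as 'kansas_player' with no extension, which is easy to overlook on the desktop.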