Scraping multiple pages into a list with BeautifulSoup
I wrote a scraper in Python using beautifulsoup4 that iterates over multiple pages of cryptocurrency values and returns the open, high, and close values. The scraping part works fine, but saving the results into my list does not: only the last page's data ends up in the list.
Can anyone help me sort this out? I've been searching for hours and can't seem to find a relevant answer. The code is below:
no_space = name_15.str.replace('\s+', '-')

#lists out the pages to scrape
for n in no_space:
    page = 'https://coinmarketcap.com/currencies/' + n + '/historical-data/'
    http = lib.PoolManager()
    response = http.request('GET', page)
    soup = BeautifulSoup(response.data, "lxml")
    main_table = soup.find('tbody')

    date = []
    open_p = []
    high_p = []
    low_p = []
    close_p = []
    table = []
    for row in main_table.find_all('td'):
        table_pull = row.find_all_previous('td')  # other find methods aren't returning what I need, but this works just fine
        table = [p.text.strip() for p in table_pull]

    date = table[208:1:-7]
    open_p = table[207:1:-7]
    high_p = table[206:1:-7]
    low_p = table[205:1:-7]
    close_p = table[204:0:-7]

    df = pd.DataFrame(date, columns=['Date'])
    df['Open'] = list(map(float, open_p))
    df['High'] = list(map(float, high_p))
    df['Low'] = list(map(float, low_p))
    df['Close'] = list(map(float, close_p))
    print(df)
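For context, slices like `table[208:1:-7]` walk the flattened cell list backwards in steps of 7, i.e. one step per 7-column table row, picking out a single column. A minimal sketch with hypothetical data (the row/column labels here are made up for illustration):

```python
# Hypothetical flattened table: 3 rows x 7 columns of cells,
# stored row by row, newest row first as on the page.
cells = [f"r{r}c{c}" for r in range(3) for c in range(7)]

# Stepping backwards by 7 from a column's last occurrence collects
# that column once per row, oldest row first.
dates = cells[14::-7]   # column 0 of rows 2, 1, 0
print(dates)            # ['r2c0', 'r1c0', 'r0c0']
```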
In short, it looks like you are grabbing all the 'td' elements and then, for each one, walking back over all of the previous 'td' elements, which is unnecessary. Also, as @hoefling pointed out, you keep overwriting your variables inside the loop, which is why only the last element ends up in your list: only the loop's final iteration sets the values that survive, since every earlier assignment gets overwritten. Apologies, I can't test this at the moment because of a firewall on my machine. Try the following:
import pandas as pd
import urllib3 as lib  # assuming `lib` is urllib3, given PoolManager in your snippet
from bs4 import BeautifulSoup

no_space = name_15.str.replace('\s+', '-')

#lists out the pages to scrape
for n in no_space:
    page = 'https://coinmarketcap.com/currencies/' + n + '/historical-data/'
    http = lib.PoolManager()
    response = http.request('GET', page)
    soup = BeautifulSoup(response.data, "lxml")
    main_table = soup.find('tbody')
    table = [p.text.strip() for p in main_table.find_all('td')]

    # You will need to re-think these indices here to get the info you want
    date = table[208:1:-7]
    open_p = table[207:1:-7]
    high_p = table[206:1:-7]
    low_p = table[205:1:-7]
    close_p = table[204:0:-7]

    df = pd.DataFrame(date, columns=['Date'])
    df['Open'] = list(map(float, open_p))
    df['High'] = list(map(float, high_p))
    df['Low'] = list(map(float, low_p))
    df['Close'] = list(map(float, close_p))
    print(df)
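To keep results from every page rather than just the last one, the usual pattern is to create a container once, before the loop, and append each iteration's rows to it. A minimal pure-Python sketch with hypothetical data (no scraping involved; the coin names and prices are made up):

```python
# Hypothetical per-page rows standing in for the scraped tables.
pages = {
    "bitcoin": [("2018-01-02", 14112.2), ("2018-01-01", 13657.2)],
    "ethereum": [("2018-01-02", 861.97), ("2018-01-01", 772.64)],
}

all_rows = []                      # created once, OUTSIDE the loop
for coin, rows in pages.items():
    for date, close in rows:
        # append instead of reassigning, so earlier pages survive
        all_rows.append({"coin": coin, "date": date, "close": close})

print(len(all_rows))  # 4 -- every page's rows, not just the last page's
```

The same idea applies to the DataFrames above: append each page's `df` to a list and combine them after the loop, instead of letting each iteration replace the previous one.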