
How to scrape data from flexbox element/container with Python and Beautiful Soup

I am trying to scrape data from a utility website using Python, Beautiful Soup, and Selenium. The data I am trying to scrape includes time, cause, status, etc. When I run a typical page request, parse the page, and look for the data I need (inside id="OutageListTable"), the divs and strings are nowhere to be found. When I inspect the page element, the data is there, but it is inside a flex container.

This is the code that I am using:

from bs4 import BeautifulSoup as soup
from selenium import webdriver

my_url = 'https://www.pse.com/outage/outage-map'

browser = webdriver.Firefox()
browser.get(my_url)

html = browser.page_source
page_soup = soup(html, features='lxml')

outage_list = page_soup.find(id='OutageListTable')
print(outage_list)

browser.quit()

How do you retrieve information that is in a flex/flexbox container? I am not finding any resources online to help me figure it out.

The data is loaded dynamically via JavaScript, so it is not present in the initial HTML. You can use the requests module to obtain the data directly from the underlying API endpoint.

For example:

import json
import requests

url = 'https://www.pse.com/api/sitecore/OutageMap/AnonymoussMapListView'

data = requests.get(url).json()

# uncomment this to print all data:
#print(json.dumps(data, indent=4))

for d in data['PseMap']:
    poi = d['DataProvider']['PointOfInterest']
    print('{} - {}'.format(poi['Title'], poi['MapType']))
    for info in d['DataProvider']['Attributes']:
        print(info['Name'], info['Value'])
    print('-' * 80)

Prints:

Bellingham - Outage
Start time 06/02 06:09 PM
Est. restoration time 06/03 06:30 AM
Customers impacted 1
Cause Trees/Vegetation
Status Crew assigned
Last updated 06/02 11:50 PM
--------------------------------------------------------------------------------
Deming - Outage
Start time 06/02 07:10 PM
Est. restoration time 06/03 03:30 AM
Customers impacted 568
Cause Accident
Status Repair crew onsite
Last updated 06/02 11:50 PM
--------------------------------------------------------------------------------
Everest - Outage
Start time 06/02 10:42 AM
Customers impacted 4
Cause Scheduled Outage
Status Repair crew onsite
Last updated 06/02 10:50 AM
--------------------------------------------------------------------------------
Kenmore - Outage
Start time 06/02 09:59 PM
Est. restoration time 05/29 01:00 AM
Customers impacted 2
Cause Scheduled Outage
Status Repair crew onsite
Last updated 06/02 10:05 PM
--------------------------------------------------------------------------------
Kent - Outage
Start time 06/02 06:43 PM
Est. restoration time To Be Determined
Customers impacted 26
Cause Car/Equip Accident
Status Waiting for repairs
Last updated 06/02 10:15 PM
--------------------------------------------------------------------------------
Kent - Outage
Start time 06/02 10:09 PM
Est. restoration time To Be Determined
Customers impacted 13
Cause Under Investigation
Status Repair crew onsite
Last updated 06/02 10:15 PM
--------------------------------------------------------------------------------
Northwest Bellevue - Outage
Start time 06/02 11:28 PM
Est. restoration time To Be Determined
Customers impacted 14
Cause Under Investigation
Status Repair crew onsite
Last updated 06/02 11:30 PM
--------------------------------------------------------------------------------
Pacific - Outage
Start time 06/02 06:19 PM
Est. restoration time 06/03 02:30 AM
Customers impacted 3
Cause Accident
Status Crew assigned
Last updated 06/02 11:00 PM
--------------------------------------------------------------------------------
Woodinville - Outage
Start time 06/02 08:29 PM
Est. restoration time 06/03 03:30 AM
Customers impacted 2
Cause Under Investigation
Status Crew assigned
Last updated 06/03 12:15 AM
--------------------------------------------------------------------------------
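The same JSON structure can be flattened into rows for a CSV file. A minimal sketch, assuming the `PseMap` layout shown above (the `outages_to_csv` helper name is my own, and the sample record here is abridged for illustration):

```python
import csv
import io

def outages_to_csv(data, fileobj):
    """Flatten the PseMap JSON structure into CSV rows, one per attribute."""
    writer = csv.writer(fileobj)
    writer.writerow(['Title', 'MapType', 'Name', 'Value'])
    for d in data['PseMap']:
        poi = d['DataProvider']['PointOfInterest']
        for info in d['DataProvider']['Attributes']:
            writer.writerow([poi['Title'], poi['MapType'],
                             info['Name'], info['Value']])

# sample data mirroring the API shape shown above (abridged)
sample = {'PseMap': [{'DataProvider': {
    'PointOfInterest': {'Title': 'Bellingham', 'MapType': 'Outage'},
    'Attributes': [{'Name': 'Cause', 'Value': 'Trees/Vegetation'}]}}]}

buf = io.StringIO()
outages_to_csv(sample, buf)
print(buf.getvalue())
```

With the real response, you would pass `requests.get(url).json()` in place of `sample` and an open file in place of the `StringIO` buffer.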

You are overthinking the problem. First, there is no special flexbox container to deal with; it's a simple case of selecting the right div class. You should be looking at div class_='col-xs-12 col-sm-6 col-md-4 listView-container'.

from bs4 import BeautifulSoup
from selenium import webdriver
from selenium.webdriver.chrome.options import Options
from selenium.webdriver.support.ui import WebDriverWait
from selenium.webdriver.support import expected_conditions as EC
from selenium.webdriver.common.by import By
from selenium.common.exceptions import TimeoutException

# create object for chrome options
chrome_options = Options()
base_url = 'https://www.pse.com/outage/outage-map'

chrome_options.add_argument('disable-notifications')
chrome_options.add_argument('start-maximized')
chrome_options.add_argument('user-data-dir=C:\\Users\\username\\AppData\\Local\\Google\\Chrome\\User Data\\Default')
# To disable the message, "Chrome is being controlled by automated test software"
chrome_options.add_argument('--disable-infobars')
# Pass the argument 1 to allow and 2 to block notifications
chrome_options.add_experimental_option("prefs", {
    "profile.default_content_setting_values.notifications": 2
    })
# invoke the webdriver
browser = webdriver.Chrome(executable_path = r'C:/Users/username/Documents/playground_python/chromedriver.exe',
                          options = chrome_options)
browser.get(base_url)
delay = 5  # seconds

try:
    # wait until the dynamically rendered list-view containers exist
    WebDriverWait(browser, delay).until(
        EC.presence_of_element_located((By.CLASS_NAME, 'listView-container')))
    print("Page is ready")
    html = browser.execute_script("return document.getElementsByTagName('html')[0].innerHTML")
    soup = BeautifulSoup(html, "html.parser")
    for item_n in soup.find_all('div', class_='col-xs-12 col-sm-6 col-md-4 listView-container'):
        for item_n_text in item_n.find_all(name="span"):
            print(item_n_text.text)
except TimeoutException:
    print("Loading took too much time! Try again.")
finally:
    # close the automated browser
    browser.close()

Page is ready
1. Bellingham
Start time: 
06/02 06:09 PM
Est. restoration time: 
06/03 06:30 AM
Customers impacted: 
1
Cause: 
Trees/Vegetation
Status: 
Crew assigned
Last updated: 
06/02 11:50 PM
2. Deming
Start time: 
06/02 07:10 PM
Est. restoration time: 
06/03 03:30 AM
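Once the rendered HTML is in hand, that multi-class lookup can also be written as a CSS selector. A minimal sketch on inline HTML (the markup below is a simplified illustration, not the site's exact source):

```python
from bs4 import BeautifulSoup

html = '''
<div class="col-xs-12 col-sm-6 col-md-4 listView-container">
  <span>Cause: </span><span>Accident</span>
</div>
'''
soup = BeautifulSoup(html, 'html.parser')

# select_one with a CSS selector matches any element whose class list
# contains listView-container, regardless of the other grid classes
container = soup.select_one('div.listView-container')
spans = [s.get_text(strip=True) for s in container.find_all('span')]
print(spans)
```

This avoids hard-coding the full `col-xs-12 col-sm-6 col-md-4` string, which can break if the site's grid classes change.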
