I am trying to scrape data from a utility website using Python, Beautiful Soup, and Selenium. The data I am trying to scrape includes fields such as time, cause, and status. When I run a typical page request, parse the page, and search for the data I am looking for (the element with id="OutageListTable"), then print it, the divs and strings are nowhere to be found. When I inspect the page element, the data is there, but it is inside a flex container.
This is the code that I am using:
from bs4 import BeautifulSoup as soup
from selenium import webdriver
from selenium import webdriver
my_url = 'https://www.pse.com/outage/outage-map'
browser = webdriver.Firefox()
browser.get(my_url)
html = browser.page_source
page_soup = soup(html, features='lxml')
outage_list = page_soup.find(id='OutageListTable')
print(outage_list)
browser.quit()
How do you retrieve information that is in a flex/flexbox container? I haven't been able to find any resources online that help me figure it out.
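Edit: in case it helps, here is the parse step factored into a function, plus my guess (commented out, since I can't confirm it) at an explicit wait that would let the JavaScript finish rendering before page_source is read:

```python
from bs4 import BeautifulSoup

def extract_outage_list(html):
    """Return the element with id='OutageListTable' from rendered HTML, or None."""
    # 'html.parser' is used here so the example has no lxml dependency.
    return BeautifulSoup(html, 'html.parser').find(id='OutageListTable')

# Live usage sketch with an explicit wait (blocks until the element exists):
# from selenium import webdriver
# from selenium.webdriver.common.by import By
# from selenium.webdriver.support.ui import WebDriverWait
# from selenium.webdriver.support import expected_conditions as EC
#
# browser = webdriver.Firefox()
# browser.get('https://www.pse.com/outage/outage-map')
# WebDriverWait(browser, 15).until(
#     EC.presence_of_element_located((By.ID, 'OutageListTable')))
# outage_list = extract_outage_list(browser.page_source)
# browser.quit()
```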
The data is loaded dynamically via JavaScript, so it never appears in the static HTML. You can use the requests module to fetch it directly from the site's API endpoint.
For example:
import json
import requests
url = 'https://www.pse.com/api/sitecore/OutageMap/AnonymoussMapListView'
data = requests.get(url).json()
# uncomment this to print all data:
#print(json.dumps(data, indent=4))
for d in data['PseMap']:
    print('{} - {}'.format(d['DataProvider']['PointOfInterest']['Title'], d['DataProvider']['PointOfInterest']['MapType']))
    for info in d['DataProvider']['Attributes']:
        print(info['Name'], info['Value'])
    print('-' * 80)
Prints:
Bellingham - Outage
Start time 06/02 06:09 PM
Est. restoration time 06/03 06:30 AM
Customers impacted 1
Cause Trees/Vegetation
Status Crew assigned
Last updated 06/02 11:50 PM
--------------------------------------------------------------------------------
Deming - Outage
Start time 06/02 07:10 PM
Est. restoration time 06/03 03:30 AM
Customers impacted 568
Cause Accident
Status Repair crew onsite
Last updated 06/02 11:50 PM
--------------------------------------------------------------------------------
Everest - Outage
Start time 06/02 10:42 AM
Customers impacted 4
Cause Scheduled Outage
Status Repair crew onsite
Last updated 06/02 10:50 AM
--------------------------------------------------------------------------------
Kenmore - Outage
Start time 06/02 09:59 PM
Est. restoration time 05/29 01:00 AM
Customers impacted 2
Cause Scheduled Outage
Status Repair crew onsite
Last updated 06/02 10:05 PM
--------------------------------------------------------------------------------
Kent - Outage
Start time 06/02 06:43 PM
Est. restoration time To Be Determined
Customers impacted 26
Cause Car/Equip Accident
Status Waiting for repairs
Last updated 06/02 10:15 PM
--------------------------------------------------------------------------------
Kent - Outage
Start time 06/02 10:09 PM
Est. restoration time To Be Determined
Customers impacted 13
Cause Under Investigation
Status Repair crew onsite
Last updated 06/02 10:15 PM
--------------------------------------------------------------------------------
Northwest Bellevue - Outage
Start time 06/02 11:28 PM
Est. restoration time To Be Determined
Customers impacted 14
Cause Under Investigation
Status Repair crew onsite
Last updated 06/02 11:30 PM
--------------------------------------------------------------------------------
Pacific - Outage
Start time 06/02 06:19 PM
Est. restoration time 06/03 02:30 AM
Customers impacted 3
Cause Accident
Status Crew assigned
Last updated 06/02 11:00 PM
--------------------------------------------------------------------------------
Woodinville - Outage
Start time 06/02 08:29 PM
Est. restoration time 06/03 03:30 AM
Customers impacted 2
Cause Under Investigation
Status Crew assigned
Last updated 06/03 12:15 AM
--------------------------------------------------------------------------------
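If you want the data in a file rather than printed, the nested JSON above flattens naturally into CSV rows. A sketch building on the same response structure (the output filename is arbitrary, and the live request is commented out):

```python
import csv

def flatten_outages(data):
    """Flatten the PseMap structure into one dict per outage."""
    rows = []
    for d in data['PseMap']:
        poi = d['DataProvider']['PointOfInterest']
        row = {'Title': poi['Title'], 'MapType': poi['MapType']}
        for info in d['DataProvider']['Attributes']:
            row[info['Name']] = info['Value']
        rows.append(row)
    return rows

def write_csv(rows, path):
    # Use the union of keys across rows: some outages lack fields such as
    # "Est. restoration time", as the sample output above shows.
    fields = []
    for row in rows:
        for key in row:
            if key not in fields:
                fields.append(key)
    with open(path, 'w', newline='') as f:
        writer = csv.DictWriter(f, fieldnames=fields)
        writer.writeheader()
        writer.writerows(rows)

# Live usage:
# import requests
# url = 'https://www.pse.com/api/sitecore/OutageMap/AnonymoussMapListView'
# write_csv(flatten_outages(requests.get(url).json()), 'outages.csv')
```

Missing fields simply come out as empty cells, since `csv.DictWriter` defaults absent keys to an empty string.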
You are overthinking the problem. First, there is no flexbox obstacle here; it's a simple case of targeting the right div class. You should be looking at the div with
class_='col-xs-12 col-sm-6 col-md-4 listView-container'
from bs4 import BeautifulSoup
from selenium import webdriver
from selenium.webdriver.chrome.options import Options
from selenium.webdriver.support.ui import WebDriverWait
from selenium.common.exceptions import TimeoutException
from time import sleep
# create an object for Chrome options
chrome_options = Options()
base_url = 'https://www.pse.com/outage/outage-map'
chrome_options.add_argument('--disable-notifications')
# To disable the message "Chrome is being controlled by automated test software"
chrome_options.add_argument('--disable-infobars')
chrome_options.add_argument('start-maximized')
chrome_options.add_argument('user-data-dir=C:\\Users\\username\\AppData\\Local\\Google\\Chrome\\User Data\\Default')
# Pass 1 to allow notifications and 2 to block them
chrome_options.add_experimental_option("prefs", {
    "profile.default_content_setting_values.notifications": 2
})
# invoke the webdriver
browser = webdriver.Chrome(executable_path=r'C:/Users/username/Documents/playground_python/chromedriver.exe',
                           options=chrome_options)
browser.get(base_url)
delay = 5  # seconds
while True:
    try:
        WebDriverWait(browser, delay)
        print("Page is ready")
        sleep(5)
        html = browser.execute_script("return document.getElementsByTagName('html')[0].innerHTML")
        # print(html)
        soup = BeautifulSoup(html, "html.parser")
        for item_n in soup.find_all('div', class_='col-xs-12 col-sm-6 col-md-4 listView-container'):
            for item_n_text in item_n.find_all(name="span"):
                print(item_n_text.text)
    except TimeoutException:
        print("Loading took too much time! Try again")
# close the automated browser (only reached if you add a break to the loop above)
browser.close()
Prints (excerpt — the loop keeps re-scraping, so the output repeats):
Cause:
Accident
Status:
Crew assigned
Last updated:
06/02 11:00 PM
9. Woodinville
Start time:
06/02 08:29 PM
Est. restoration time:
06/03 03:30 AM
Customers impacted:
2
Cause:
Under Investigation
Status:
Crew assigned
Last updated:
06/03 12:15 AM
Page is ready
1. Bellingham
Start time:
06/02 06:09 PM
Est. restoration time:
06/03 06:30 AM
Customers impacted:
1
Cause:
Trees/Vegetation
Status:
Crew assigned
Last updated:
06/02 11:50 PM
2. Deming
Start time:
06/02 07:10 PM
Est. restoration time:
06/03 03:30 AM
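As a follow-up to the span loop above: because the spans alternate label and value, they can be zipped into dicts instead of printed flat. A minimal sketch under that assumption (using html.parser, so no lxml dependency):

```python
from bs4 import BeautifulSoup

def pair_spans(container_html):
    """Pair alternating label/value <span> texts into a dict."""
    spans = [s.get_text(strip=True)
             for s in BeautifulSoup(container_html, 'html.parser').find_all('span')]
    # Even-indexed spans are labels, odd-indexed spans are values.
    return dict(zip(spans[::2], spans[1::2]))
```

Applied per listView-container div, this yields one record per outage instead of an undifferentiated stream of lines.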