
BeautifulSoup doesn't find tables on webpage

I'm trying to get the data from the 1st table on a website. I've looked on here for similar problems and tried a number of the given solutions but can't seem to find the table and ultimately the data in the table.

I've tried:

from bs4 import BeautifulSoup  
from selenium import webdriver  
driver = webdriver.Chrome('C:\\folder\\chromedriver.exe')  
url = 'https://docs.microsoft.com/en-us/windows/release-information/'  
driver.get(url)  

tbla = driver.find_element_by_name('table') #attempt using by element name  
tblb = driver.find_element_by_class_name('cells-centered') #attempt using by class name  
tblc = driver.find_element_by_xpath('//*[@id="winrelinfo_container"]/table[1]') #attempt by using xpath  

and tried using beautiful soup

html = driver.page_source
soup = BeautifulSoup(html,'html.parser')
table = soup.find("table", {"class": "cells-centered"})
print(len(table))

Any help is much appreciated.

The table is inside an iframe, so you need to switch to the iframe first to access it.

Induce WebDriverWait() and wait for frame_to_be_available_and_switch_to_it() with the iframe's locator, then wait for visibility_of_element_located() with the table's locator.

driver.get("https://docs.microsoft.com/en-us/windows/release-information/")
WebDriverWait(driver,10).until(EC.frame_to_be_available_and_switch_to_it((By.ID,"winrelinfo_iframe")))
table=WebDriverWait(driver,10).until(EC.visibility_of_element_located((By.CSS_SELECTOR,"table.cells-centered")))

You need to import the following:

from selenium.webdriver.support.ui import WebDriverWait
from selenium.webdriver.common.by import By
from selenium.webdriver.support import expected_conditions as EC

Or you can use the code below with an XPath locator:

driver.get("https://docs.microsoft.com/en-us/windows/release-information/")
WebDriverWait(driver,10).until(EC.frame_to_be_available_and_switch_to_it((By.ID,"winrelinfo_iframe")))
table=WebDriverWait(driver,10).until(EC.presence_of_element_located((By.XPATH,'//*[@id="winrelinfo_container"]/table[1]')))

You can then load the table data into a pandas DataFrame and export it to a CSV file:

driver.get("https://docs.microsoft.com/en-us/windows/release-information/")
WebDriverWait(driver,10).until(EC.frame_to_be_available_and_switch_to_it((By.ID,"winrelinfo_iframe")))
table=WebDriverWait(driver,10).until(EC.presence_of_element_located((By.XPATH,'//*[@id="winrelinfo_container"]/table[1]'))).get_attribute('outerHTML')
df=pd.read_html(str(table))[0]
print(df)
df.to_csv("path/to/csv")

Install pandas if needed: pip install pandas

Then import it:

import pandas as pd
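
As a self-contained illustration of the pd.read_html call used above: it parses any string of table HTML into a list of DataFrames. The table below is a hypothetical two-row stand-in for the outerHTML grabbed from the live page:

```python
from io import StringIO

import pandas as pd

# Hypothetical stand-in for the table's outerHTML from the live page.
html = """
<table class="cells-centered">
  <tr><th>Version</th><th>OS build</th></tr>
  <tr><td>2004</td><td>19041.546</td></tr>
  <tr><td>1909</td><td>18363.1110</td></tr>
</table>
"""

# read_html returns a list of DataFrames, one per <table> found; wrapping the
# string in StringIO avoids the literal-string deprecation in newer pandas.
df = pd.read_html(StringIO(html))[0]
print(df)
```

The th cells in the first row become the DataFrame's column headers, which is why `[0]` on the returned list gives a ready-to-export frame.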

The table is located inside an <iframe>, so BeautifulSoup doesn't see it in the original page; you have to request the iframe's src document and parse that instead:

import requests 
from bs4 import BeautifulSoup


url = 'https://docs.microsoft.com/en-us/windows/release-information/'
soup = BeautifulSoup(requests.get(url).content, 'html.parser')
soup = BeautifulSoup(requests.get(soup.select_one('iframe')['src']).content, 'html.parser')

for row in soup.select('table tr'):
    print(row.get_text(strip=True, separator='\t'))

Prints:

Version Servicing option    Availability date   OS build    Latest revision date    End of service: Home, Pro, Pro Education, Pro for Workstations and IoT Core End of service: Enterprise, Education and IoT Enterprise
2004    Semi-Annual Channel 2020-05-27  19041.546   2020-10-01  2021-12-14  2021-12-14  Microsoft recommends
1909    Semi-Annual Channel 2019-11-12  18363.1110  2020-09-16  2021-05-11  2022-05-10
1903    Semi-Annual Channel 2019-05-21  18362.1110  2020-09-16  2020-12-08  2020-12-08
1809    Semi-Annual Channel 2019-03-28  17763.1490  2020-09-16  2020-11-10  2021-05-11
1809    Semi-Annual Channel (Targeted)  2018-11-13  17763.1490  2020-09-16  2020-11-10  2021-05-11
1803    Semi-Annual Channel 2018-07-10  17134.1726  2020-09-08  End of service  2021-05-11

...and so on.
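The two-step parse above can be sketched on static strings (the URL and table contents below are made-up placeholders, not the real iframe src):

```python
from bs4 import BeautifulSoup

# Step 1: the outer page only exposes the <iframe> tag; grab its src URL.
outer_html = '<iframe id="winrelinfo_iframe" src="https://example.com/releases.html"></iframe>'
iframe_src = BeautifulSoup(outer_html, 'html.parser').select_one('iframe')['src']

# Step 2: in the answer above, requests.get(iframe_src) fetches this inner
# document; here a placeholder table stands in for it.
inner_html = """
<table>
  <tr><th>Version</th><th>OS build</th></tr>
  <tr><td>2004</td><td>19041.546</td></tr>
</table>
"""
soup = BeautifulSoup(inner_html, 'html.parser')
for row in soup.select('table tr'):
    print(row.get_text(strip=True, separator='\t'))
```

get_text(strip=True, separator='\t') strips each cell's text and joins the cells with tabs, which is what produces the tab-separated rows shown above.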
