Find the URL after a button click on a website using Selenium (Python)
Don't open the URL after .click() in Selenium (Python)
I am using Selenium and BeautifulSoup to scrape data from a form. The first step is to submit an entry in the search field. The second step is to scrape the data from the newly loaded form. Both steps work.
Edit: when the script sends the entry (send_keys()) and clicks the submit button (submit.click()), the web page loads visibly. I would like the page to load in the background so that I don't see it.
Here is the code:
from selenium import webdriver
from selenium.webdriver.common.by import By
from selenium.webdriver.support.ui import WebDriverWait
from selenium.webdriver.support import expected_conditions as EC
from bs4 import BeautifulSoup

searchterm = "DE431311903710"
url = 'http://eagri.cz/public/web/mze/zemedelstvi/zivocisna-vyroba/zivocisne-komodity/kone/centralni-pristupove-misto-pro-evidenci.html'

driver = webdriver.Firefox()
driver.get(url)
driver.implicitly_wait(50)

## You have to switch to the iframe like so: ##
driver.switch_to.frame(driver.find_element(By.TAG_NAME, "iframe"))

## Insert text via XPath ##
elem = driver.find_element(By.XPATH, "/html/body/div/form/div[3]/div/div[2]/table/tbody/tr[2]/td/table/tbody/tr[2]/td[2]/input")
elem.send_keys(searchterm)
submit = driver.find_element(By.XPATH, '//*[@id="btnVyhledat"]')
submit.click()

p = BeautifulSoup(driver.page_source, features="html.parser")
l = []
k = []
inputs = p.find_all('span', {"class": "editprvek"})
inputs2 = p.find_all("span", {"class": "editpopis"})
for i in inputs:
    l.append(i.text)
for j in inputs2:
    k.append(j.text)

def merge(list1, list2):
    merged = [(list1[i], list2[i]) for i in range(0, len(list1))]
    return merged

print(merge(k, l))
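As a side note, the merge helper above can be replaced by the built-in zip, which pairs elements positionally and also avoids an IndexError when the two lists have different lengths (e.g. if the page yields unequal numbers of labels and values):

```python
def merge(list1, list2):
    # zip stops at the shorter list, so mismatched
    # lengths no longer raise an IndexError
    return list(zip(list1, list2))

print(merge(["label1", "label2"], ["value1"]))  # [('label1', 'value1')]
```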
If I understand the issue correctly, the table data takes time to load after the search is submitted.

Add the following line after submit.click() to introduce a WebDriverWait() and wait for visibility_of_element_located():

WebDriverWait(driver, 30).until(EC.visibility_of_element_located((By.CSS_SELECTOR, ".editprvek")))

You need the following imports:

from selenium.webdriver.common.by import By
from selenium.webdriver.support.ui import WebDriverWait
from selenium.webdriver.support import expected_conditions as EC