
Couldn't get all links with Python

I need the links next to src-0, src-1, src-2 … src-59. How can I fix the code? Thanks


!pip install selenium
from selenium import webdriver
import time
import pandas as pd

browser = webdriver.Chrome(executable_path='./chromedriver.exe')
browser.implicitly_wait(5)
browser.get("https://tw.mall.yahoo.com/store/%E5%B1%88%E8%87%A3%E6%B0%8FWatsons:watsons")


# Product links

linkPath = "//ul[@class='gridList']/li/a"
product_links = browser.find_elements_by_xpath(linkPath)
print(product_links)



The XPath you are using does not match any elements in the DOM.

Try something like the below; hopefully these are the links you are trying to extract.

# Imports required
from selenium import webdriver
from selenium.webdriver.common.by import By

# Start the driver (chromedriver must be on your PATH)
driver = webdriver.Chrome()
driver.get("https://tw.mall.yahoo.com/store/%E5%B1%88%E8%87%A3%E6%B0%8FWatsons:watsons")

# The product anchors sit under the MainListing__StoreBoothWrap section
links = driver.find_elements(By.XPATH, "//section[contains(@class,'MainListing__StoreBoothWrap')]/div/div/div/ul/li/a")
print(len(links))

for link in links:
    print(link.get_attribute("href"))
Output:

60
https://tw.mall.yahoo.com/item/p0330231079018
https://tw.mall.yahoo.com/item/p0330221397264
https://tw.mall.yahoo.com/item/p0330201617111
...
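
Since the question also imports pandas, here is a minimal sketch (not part of the original answer) that adds an explicit wait before scraping and stores the hrefs in a DataFrame. It assumes the same XPath still matches the product anchors; the CSV file name watsons_links.csv and the column name url are my own choices.

import pandas as pd
from selenium import webdriver
from selenium.webdriver.common.by import By
from selenium.webdriver.support.ui import WebDriverWait
from selenium.webdriver.support import expected_conditions as EC

driver = webdriver.Chrome()
driver.get("https://tw.mall.yahoo.com/store/%E5%B1%88%E8%87%A3%E6%B0%8FWatsons:watsons")

# Wait up to 10 seconds for at least one product anchor to be present
xpath = "//section[contains(@class,'MainListing__StoreBoothWrap')]/div/div/div/ul/li/a"
WebDriverWait(driver, 10).until(
    EC.presence_of_all_elements_located((By.XPATH, xpath))
)

# Collect the href of every matched anchor
hrefs = [a.get_attribute("href") for a in driver.find_elements(By.XPATH, xpath)]

# Store the links in a DataFrame and save them to CSV
df = pd.DataFrame({"url": hrefs})
df.to_csv("watsons_links.csv", index=False)
print(df.head())

driver.quit()

The explicit wait just guards against the list being scraped before the page has finished rendering; if the page is already loaded, it returns immediately.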
