
How to make sure the data is matching while web-scraping to CSV?

I am extracting data from the DESWATER website and saving it to a CSV. To illustrate the problem: of these two authors, one has a full-text file and the other does not, so the file ends up saved against the wrong author.

So the CSV output looks like this:

Authors        | File
First Author   | Second File
Second Author  | Third File

But I want output like this:

Authors        | File
First Author   | 'No File'
Second Author  | Second File
Third Author   | Third File
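
The shifted pairing above is exactly what position-based zipping produces when one list is shorter (a minimal illustration with made-up author and file names):

```python
# Made-up data: the first author has no full-text file, so the
# files list is one item short and every pair after it shifts.
authors = ['First Author', 'Second Author', 'Third Author']
files = ['Second File', 'Third File']

# zip() pairs purely by position and stops at the shorter list,
# so 'First Author' wrongly gets 'Second File' and 'Third Author'
# is dropped entirely.
pairs = list(zip(authors, files))
print(pairs)
# → [('First Author', 'Second File'), ('Second Author', 'Third File')]
```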

Here is a small test script:

from bs4 import BeautifulSoup
import requests
import time
import csv

list_of_authors = []
list_of_full_file = []

r = requests.get('https://www.deswater.com/vol.php?vol=1&oth=1|1-3|January|2009')
# Parsing the HTML
soup = BeautifulSoup(r.content, 'html.parser')

#'Author'
s = soup.find('td', class_='testo_normale')
authors = s.find_all('i')
for author in authors:
    list_of_authors.append(author.text.strip())
    time.sleep(1)


#'FULL TEXT'
# find all the anchor tags with "href"
n=1
for link in soup.find_all('a', class_='testo_normale_rosso'):
    if "fulltext.php?abst=" in link.get('href'):
        # TO ADD
        baseurl = 'https://www.deswater.com/'
        Full_links=baseurl+link.attrs['href'].replace('\n','')
        list_of_full_file.append(f'file {n}')
        n+=1            
        time.sleep(1) 

def Save_csv():
    row_head =['Author', 'File Name']
    Data = []
    for author, file in zip(list_of_authors, list_of_full_file):
        Data.append(author)
        Data.append(file)
    rows = [Data[i:i + 2] for i in range(0, len(Data), 2)]

    with open('data.csv', 'w', encoding='utf_8_sig', newline="") as csvfile:
        csvwriter = csv.writer(csvfile)
        csvwriter.writerow(row_head)
        csvwriter.writerows(rows)

Save_csv()

This code will eventually extract data from 279 pages, so I need it to automatically detect that an author has no full text, so that I can append 'No File' for that author.

See the website here for a reference of the correct matching. The first author has no full-text file. Any ideas?

If you cannot guarantee equal lengths, try changing your element-selection strategy and avoid maintaining multiple parallel lists.

Use CSS selectors here to select all `<hr>` elements; each `<hr>` is the base from which `find_previous()` makes every other selection:

for e in soup.select('.testo_normale hr'):
    data.append({
        'author': e.find_previous('i').text,
        'file': 'https://www.deswater.com/'+e.find_previous('a').get('href') if 'fulltext' in e.find_previous('a').get('href') else 'no url'
    })

Example

from bs4 import BeautifulSoup
import requests
import csv

soup = BeautifulSoup(requests.get('https://www.deswater.com/vol.php?vol=1&oth=1|1-3|January|2009').content, 'html.parser')

with open('data.csv', 'w', encoding='utf-8', newline='') as f:
    
    data = []

    for e in soup.select('.testo_normale hr'):
        data.append({
            'author': e.find_previous('i').text,
            'file': 'https://www.deswater.com/'+e.find_previous('a').get('href') if 'fulltext' in e.find_previous('a').get('href') else 'no url'
        })

    dict_writer = csv.DictWriter(f, data[0].keys())
    dict_writer.writeheader()
    dict_writer.writerows(data)

Output

author,file
Miriam Balaban,no url
W. Richard Bowen,https://www.deswater.com/fulltext.php?abst=XFxEV1RfYWJzdHJhY3RzXFx2b2xfMVxcMV8yMDA5XzEucGRm&desc=k@1@kfontk@13@kfacek@7@kk@30@kGenevak@6@kk@13@kArialk@6@kk@13@kHelveticak@6@kk@13@ksank@35@kserifk@30@kk@13@ksizek@7@kk@30@k2k@30@kk@2@kk@1@kik@2@kW.k@13@kRichardk@13@kBowenk@1@kk@4@kik@2@kk@1@kbrk@2@kWaterk@13@kengineeringk@13@kfork@13@kthek@13@kpromotionk@13@kofk@13@kpeacek@1@kbrk@2@k1k@15@k2009k@16@k1k@35@k6k@1@kbrk@4@kk@2@kk@1@kak@13@khrefk@7@kDWTk@12@kabstractsk@4@kvolk@12@k1k@4@k1k@12@k2009k@12@k1.pdfk@13@kclassk@7@kk@5@kk@30@ktestok@12@knormalek@12@krossok@5@kk@30@kk@13@ktargetk@7@kk@5@kk@30@kk@12@kblankk@5@kk@30@kk@2@kAbstractk@1@kk@4@kak@2@kk@1@kbrk@2@k&id23=RFdUX2FydGljbGVzL1REV1RfSV8wMV8wMS0wM190ZmphL1REV1RfQV8xMDUxMjg2NC9URFdUX0FfMTA1MTI4NjRfTy5wZGY=&type=1
Steven J. Duranceau,https://www.deswater.com/fulltext.php?abst=XFxEV1RfYWJzdHJhY3RzXFx2b2xfMVxcMV8yMDA5XzcucGRm&desc=k@1@kfontk@13@kfacek@7@kk@30@kGenevak@6@kk@13@kArialk@6@kk@13@kHelveticak@6@kk@13@ksank@35@kserifk@30@kk@13@ksizek@7@kk@30@k2k@30@kk@2@kk@1@kik@2@kStevenk@13@kJ.k@13@kDuranceauk@1@kk@4@kik@2@kk@1@kbrk@2@kModelingk@13@kthek@13@kpermeatek@13@ktransientk@13@kresponsek@13@ktok@13@kperturbationsk@13@kfromk@13@ksteadyk@13@kstatek@13@kink@13@kak@13@knanofiltrationk@13@kprocessk@1@kbrk@2@k1k@15@k2009k@16@k7k@35@k16k@1@kbrk@4@kk@2@kk@1@kak@13@khrefk@7@kDWTk@12@kabstractsk@4@kvolk@12@k1k@4@k1k@12@k2009k@12@k7.pdfk@13@kclassk@7@kk@5@kk@30@ktestok@12@knormalek@12@krossok@5@kk@30@kk@13@ktargetk@7@kk@5@kk@30@kk@12@kblankk@5@kk@30@kk@2@kAbstractk@1@kk@4@kak@2@kk@1@kbrk@2@k&id23=RFdUX2FydGljbGVzL1REV1RfSV8wMV8wMS0wM190ZmphL1REV1RfQV8xMDUxMjg2NS9URFdUX0FfMTA1MTI4NjVfTy5wZGY=&type=1
"Dmitry Lisitsin, David Hasson, Raphael Semiat",https://www.deswater.com/fulltext.php?abst=XFxEV1RfYWJzdHJhY3RzXFx2b2xfMVxcMV8yMDA5XzE3LnBkZg==&desc=k@1@kfontk@13@kfacek@7@kk@30@kGenevak@6@kk@13@kArialk@6@kk@13@kHelveticak@6@kk@13@ksank@35@kserifk@30@kk@13@ksizek@7@kk@30@k2k@30@kk@2@kk@1@kik@2@kDmitryk@13@kLisitsink@6@kk@13@kDavidk@13@kHassonk@6@kk@13@kRaphaelk@13@kSemiatk@1@kk@4@kik@2@kk@1@kbrk@2@kModelingk@13@kthek@13@keffectk@13@kofk@13@kantik@35@kscalantk@13@konk@13@kCaCO3k@13@kprecipitationk@13@kink@13@kcontinuousk@13@kflowk@1@kbrk@2@k1k@15@k2009k@16@k17k@35@k24k@1@kbrk@4@kk@2@kk@1@kak@13@khrefk@7@kDWTk@12@kabstractsk@4@kvolk@12@k1k@4@k1k@12@k2009k@12@k17.pdfk@13@kclassk@7@kk@5@kk@30@ktestok@12@knormalek@12@krossok@5@kk@30@kk@13@ktargetk@7@kk@5@kk@30@kk@12@kblankk@5@kk@30@kk@2@kAbstractk@1@kk@4@kak@2@kk@1@kbrk@2@k&id23=RFdUX2FydGljbGVzL1REV1RfSV8wMV8wMS0wM190ZmphL1REV1RfQV8xMDUxMjg2Ni9URFdUX0FfMTA1MTI4NjZfTy5wZGY=&type=1
"M.A. Darwish, Fatima M. Al-Awadhi, A. Akbar, A. Darwish",https://www.deswater.com/fulltext.php?abst=XFxEV1RfYWJzdHJhY3RzXFx2b2xfMVxcMV8yMDA5XzI1LnBkZg==&desc=k@1@kfontk@13@kfacek@7@kk@30@kGenevak@6@kk@13@kArialk@6@kk@13@kHelveticak@6@kk@13@ksank@35@kserifk@30@kk@13@ksizek@7@kk@30@k2k@30@kk@2@kk@1@kik@2@kM.A.k@13@kDarwishk@6@kk@13@kFatimak@13@kM.k@13@kAlk@35@kAwadhik@6@kk@13@kA.k@13@kAkbark@6@kk@13@kA.k@13@kDarwishk@1@kk@4@kik@2@kk@1@kbrk@2@kAlternativek@13@kprimaryk@13@kenergyk@13@kfork@13@kpowerk@13@kdesaltingk@13@kplantsk@13@kink@13@kKuwaitk@32@kk@13@kthek@13@knucleark@13@koptionk@13@kIk@1@kbrk@2@k1k@15@k2009k@16@k25k@35@k41k@1@kbrk@4@kk@2@kk@1@kak@13@khrefk@7@kDWTk@12@kabstractsk@4@kvolk@12@k1k@4@k1k@12@k2009k@12@k25.pdfk@13@kclassk@7@kk@5@kk@30@ktestok@12@knormalek@12@krossok@5@kk@30@kk@13@ktargetk@7@kk@5@kk@30@kk@12@kblankk@5@kk@30@kk@2@kAbstractk@1@kk@4@kak@2@kk@1@kbrk@2@k&id23=RFdUX2FydGljbGVzL1REV1RfSV8wMV8wMS0wM190ZmphL1REV1RfQV8xMDUxMjg2Ny9URFdUX0FfMTA1MTI4NjdfTy5wZGY=&type=1
...
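
Since the question asks for 'No File' rather than 'no url', the same `<hr>`-anchored pairing can be adapted and sanity-checked offline against a minimal, invented HTML snippet that mimics the page layout (the snippet and author names below are assumptions for illustration, not the real markup):

```python
from bs4 import BeautifulSoup

# Invented HTML mimicking the DESWATER layout: each entry ends with <hr>,
# and only some entries carry a fulltext.php link before it.
html = '''
<td class="testo_normale">
  <i>First Author</i> <a href="abstract.php?x=1">Abstract</a> <hr>
  <i>Second Author</i> <a href="fulltext.php?abst=AAA">Abstract</a> <hr>
</td>
'''

soup = BeautifulSoup(html, 'html.parser')
rows = []
for e in soup.select('.testo_normale hr'):
    # Each <hr> closes one entry; walk backwards to that entry's
    # author and nearest link, so pairs can never drift apart.
    href = e.find_previous('a').get('href')
    rows.append({
        'author': e.find_previous('i').text,
        'file': 'https://www.deswater.com/' + href if 'fulltext' in href else 'No File',
    })
print(rows)
```

Because the author and the link are resolved relative to the same `<hr>`, an entry without a full-text link falls into the `'No File'` branch instead of shifting every later row.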
