
Scrapy pass extra data from csv file into parse

My Scrapy spider reads a csv file and builds start_urls from the addresses in that file, like so:

 from csv import DictReader

 with open('addresses.csv') as rows:
     start_urls = ['http://www.example.com/search/?where=' + row["Address"].replace(',', '').replace(' ', '+')
                   for row in DictReader(rows)]
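As an aside, the chained .replace() calls only handle commas and spaces; urllib.parse.quote_plus from the standard library URL-encodes the whole value. Note that it percent-encodes the comma rather than dropping it, so the resulting URL differs slightly from the original approach. A minimal sketch with a hypothetical address value:

```python
from urllib.parse import quote_plus

# Hypothetical value as it might appear in the CSV's "Address" column
address = "123 Main St, Springfield"

# quote_plus turns spaces into '+' and percent-encodes other reserved
# characters (the comma becomes %2C instead of being stripped)
url = 'http://www.example.com/search/?where=' + quote_plus(address)
# → http://www.example.com/search/?where=123+Main+St%2C+Springfield
```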

But the .csv file also contains emails and other information. How can I pass this extra information into parse so it can be added to the scraped output?

import scrapy
from csv import DictReader

with open('addresses.csv') as rows:
    names = [row["Name"].replace(',', '') for row in DictReader(rows)]
    emails = [row["Email"].replace(',', '') for row in DictReader(rows)]
    start_urls = ['http://www.example.com/search/?where=' + row["Address"].replace(',', '').replace(' ', '+')
                  for row in DictReader(rows)]

def parse(self, response):
    yield {
        'name': FROM_CSV,
        'email': FROM_CSV,
        'address': FROM_SCRAPING,
        'city': FROM_SCRAPING,
    }
import scrapy
from scrapy import Request
from csv import DictReader

class MySpider(scrapy.Spider):

    def start_requests(self):
        with open('addresses.csv') as rows:
            for row in DictReader(rows):
                name = row["Name"].replace(',', '')
                email = row["Email"].replace(',', '')

                link = 'http://www.example.com/search/?where=' + row["Address"].replace(',', '').replace(' ', '+')

                yield Request(url=link,
                              callback=self.parse,
                              method="GET",
                              meta={'name': name, 'email': email})

    def parse(self, response):
        yield {
            'name': response.meta['name'],
            'email': response.meta['email'],
            'address': FROM_SCRAPING,
            'city': FROM_SCRAPING,
        }
  • Open your CSV file.
  • Iterate over it inside the start_requests method.
  • Pass parameters to the callback function via the meta argument; you can pass a Python dictionary in meta.

Note: Remember that start_requests is not a custom-defined method here; it is Scrapy's own method. See https://doc.scrapy.org/en/latest/topics/spiders.html#scrapy.spiders.Spider.start_requests
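Outside a running crawl, the start_requests logic above can be exercised with an in-memory CSV. The sample rows below are hypothetical stand-ins for addresses.csv:

```python
import csv
import io

# Hypothetical stand-in for addresses.csv
sample = io.StringIO(
    "Name,Email,Address\n"
    "Alice,alice@example.com,123 Main St Springfield\n"
)

requests = []
for row in csv.DictReader(sample):
    link = ('http://www.example.com/search/?where='
            + row["Address"].replace(',', '').replace(' ', '+'))
    # Each tuple mirrors what the spider yields: Request(url=link, meta={...})
    requests.append((link, {'name': row["Name"], 'email': row["Email"]}))
```

Since Scrapy 1.7, cb_kwargs is the documented way to pass plain arguments to a callback; meta still works, but it is also shared with middlewares and extensions.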
