
Scrapy : How to pass list of arguments through command prompt to spider?

I'm creating a scraper for a fantasy team. I'm looking for a way to pass a list of player names as arguments, and then, for each player_name in player_list, run the parsing code.

I currently have something like this

class statsspider(BaseSpider):
    name = 'statsspider'

    def __init__(self, domain=None, player_list=""):
        self.allowed_domains = ['sports.yahoo.com']
        self.start_urls = [
            'http://sports.yahoo.com/nba/players',
        ]
        self.player_list = "%s" % player_list

    def parse(self, response):
        # example code
        yield request

I'm assuming that entering a list of arguments works the same way as entering a single argument on the command line, so I enter something like this:

scrapy crawl statsspider -a player_list=['xyz','abc']

Problem 2!

I solved the first issue by inputting a comma-delimited list of arguments, like so:

scrapy crawl statsspider -a player_list="abc def,ghi jkl"

I now want to go through each "name" (i.e. 'abc def') to find the first initial of the last name (in this case 'd').

I use the code

array = []
for player_name in self.player_list:
    array.append(player_name)
print array

And I end up with the result [["'",'a','b','c',... etc]]. Why does Python not assign player_name to each 'name' (e.g. 'abc def' and 'ghi jkl')? Can someone explain this logic to me? I will probably understand the right way to do it afterwards!

Shell arguments are passed as strings. You need to parse the argument yourself in your code.
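That also explains the character-by-character output above: iterating over a string in Python yields its individual characters, not whole names. A quick illustration outside Scrapy:

```python
# The -a argument arrives as one string, so a for-loop over it
# walks character by character rather than name by name.
player_list = "abc def,ghi jkl"
chars = [c for c in player_list]
print(chars[:4])  # ['a', 'b', 'c', ' ']
```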

command line:

scrapy crawl statsspider -a player_list=xyz,abc

python code:

self.player_list = player_list.split(',')
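Putting both problems together, here is a minimal sketch (assuming names are "First Last" strings, as in the question) that splits the comma-delimited argument and then pulls the first initial of each last name:

```python
# Simulate the string that -a player_list=... delivers to the spider.
raw = "abc def,ghi jkl"

# Split the comma-delimited argument into a list of whole names.
player_list = raw.split(',')          # ['abc def', 'ghi jkl']

# For each name, take the last whitespace-separated word (the last
# name) and grab its first character.
initials = [name.split()[-1][0] for name in player_list]
print(initials)                       # ['d', 'j']
```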
