
using proxy with scrapy-splash

I'm trying to use a proxy (proxymesh) alongside scrapy-splash. I have the following (relevant) code:

PROXY = """splash:on_request(function(request)
    request:set_proxy{
        host = http://us-ny.proxymesh.com,
        port = 31280,
        username = username,
        password = secretpass,
    }
    return splash:html()
end)"""

and in start_requests

def start_requests(self):
    for url in self.start_urls:
        print(url)
        yield SplashRequest(url, self.parse,
            endpoint='execute',
            args={'wait': 5,
                  'lua_source': PROXY,
                  'js_source': 'document.body'})

But it does not seem to work: self.parse is never called. If I change the endpoint to 'render.html' I do hit the self.parse method, but when I inspect the headers (response.headers) I can see that the request is not going through the proxy. I confirmed this by setting http://checkip.dyndns.org/ as the starting URL and seeing my old IP address when parsing the response.

What am I doing wrong?

You should add a 'proxy' argument to the args of the SplashRequest object.

def start_requests(self):
    for url in self.start_urls:
        print(url)
        yield SplashRequest(url, self.parse,
            endpoint='execute',
            args={'wait': 5,
                  'lua_source': PROXY,
                  'js_source': 'document.body',
                  'proxy': 'http://proxy_ip:proxy_port'})
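
Since proxymesh requires authentication, the credentials can be embedded in the proxy URL itself: Splash (2.1 and later) accepts a proxy URL of the form [protocol://][user:password@]proxyhost[:port] in the 'proxy' argument. A minimal sketch, reusing the host, port and placeholder credentials from the question (not verified values) and the simpler 'render.html' endpoint:

def start_requests(self):
    # Sketch only: host, port, username and password are the placeholder
    # values from the question, not verified working credentials.
    proxy = 'http://username:secretpass@us-ny.proxymesh.com:31280'
    for url in self.start_urls:
        print(url)
        yield SplashRequest(url, self.parse,
            endpoint='render.html',
            args={'wait': 5,
                  'proxy': proxy})

Note also that the 'execute' endpoint expects the lua_source script to define a function main(splash) and return a result (for example splash:html()); the PROXY script in the question has no main function, which is the likely reason self.parse is never reached with endpoint='execute'. Using 'render.html' with the 'proxy' argument avoids the custom Lua script entirely.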
