Run Scrapy Web Crawler with Selenium in Linux Server

I developed a web crawler with Scrapy and Selenium (Python), and it runs successfully on my local machine. I am curious whether I could upload the whole crawler project to my Linux server and run it there just like I do locally.

My only concern is that on my local machine, the program opens a browser and imitates human actions while it runs, but on the Linux server, as you know, there is no browser to open.

So, can we do that?

You can use a virtual display to run a headless X-server.

First, install the packages if they are not already present:

sudo apt-get install xvfb python-pip
sudo pip install pyvirtualdisplay

And add the following code before starting a Selenium webdriver:

from pyvirtualdisplay import Display
display = Display(visible=0, size=(800, 600))
display.start()

Then stop the virtual display at the end of the execution:

display.stop()
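Putting the pieces together, here is a minimal end-to-end sketch. It assumes pyvirtualdisplay and Selenium are installed as above; the Firefox driver, the `crawl_headless` name, and the example URL are illustrative choices, not from the original question:

```python
# Sketch: drive a real browser on a server with no display by wrapping
# the Selenium session in a pyvirtualdisplay (Xvfb) virtual screen.

def crawl_headless(url):
    """Start a virtual X display, load a page in Firefox, return its title."""
    # Imports are local so the sketch can be read without the packages installed.
    from pyvirtualdisplay import Display
    from selenium import webdriver

    display = Display(visible=0, size=(800, 600))  # headless X server via Xvfb
    display.start()
    try:
        driver = webdriver.Firefox()  # any Selenium webdriver works here
        try:
            driver.get(url)
            return driver.title
        finally:
            driver.quit()   # close the browser first...
    finally:
        display.stop()      # ...then tear down the virtual display


if __name__ == "__main__":
    print(crawl_headless("https://example.com"))
```

The same wrapping works for a Scrapy spider that launches Selenium internally: start the display before the crawl begins and stop it after the crawl finishes, so every browser the spider opens renders into the virtual screen.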
