I'm trying to run an R script that uses RSelenium to scrape a website and download a file.
This works fine on my local machine, but I've been struggling to do the same on a remote Linux server.
Code:
# To start up Docker Container
system('sudo docker run -d --rm --name selenium_container -v /home/Downloads:/home/seluser/Downloads -p 4445:4444 -p 5900:5900 selenium/standalone-chrome:3.14')
# To initiate the driver in R, "x.x.x.x" being the server IP address (I have tried 'localhost' as well)
remDr <- RSelenium::remoteDriver(remoteServerAddr = "x.x.x.x",
                                 port = 4445L,
                                 browserName = "chrome")
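For completeness, a sketch of how the full flow might look, assuming the container above is running. The URL and the CSS selector are placeholders, and pointing Chrome at the mounted folder via `chromeOptions` download prefs is an assumption about the setup (by default Chrome inside the container downloads to `/home/seluser/Downloads` anyway):

```r
library(RSelenium)

# Hypothetical Chrome prefs so downloads land in the mounted folder
# inside the container without a save-as prompt.
eCaps <- list(
  chromeOptions = list(
    prefs = list(
      "download.default_directory"  = "/home/seluser/Downloads",
      "download.prompt_for_download" = FALSE
    )
  )
)

remDr <- remoteDriver(remoteServerAddr = "x.x.x.x",  # or "localhost"
                      port = 4445L,
                      browserName = "chrome",
                      extraCapabilities = eCaps)

remDr$open()
remDr$navigate("https://example.com/report")               # placeholder URL
btn <- remDr$findElement("css selector", "#download-btn")  # hypothetical selector
btn$clickElement()
Sys.sleep(5)   # crude wait for the download to finish
remDr$close()
```

This needs a live Selenium server to run, so treat it as a template rather than something to execute as-is.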
Expected result from clicking on download button:
File should download to /home/Downloads and also be in docker container in /home/seluser/Downloads
Actual result:
No files in those folders.
Suspected problem:
Read/write permissions of some sort?
Fixed my issue:
It was a read/write permissions issue on the host directory mounted into the container.
Fixed it with: chmod -R 777 /Downloads/Location
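The effect of the fix can be demonstrated on a throwaway directory (the path here is just for the demo; the same command applies to the real mount point). Mode 777 grants read/write/execute to owner, group, and everyone else, which is why the container's `seluser` account can then write into the host-mounted folder:

```shell
# Reproduce the fix on a scratch directory (no sudo needed here).
mkdir -p /tmp/demo_downloads
chmod -R 777 /tmp/demo_downloads
stat -c '%a' /tmp/demo_downloads   # prints 777
```

Note that 777 makes the directory world-writable; a tighter alternative is to `chown` the directory to the UID that `seluser` has inside the container (check it with `docker exec selenium_container id seluser`) and grant write access to that user only.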