
How can I download all the files from a remote directory to my local directory?

I want to download all the files in a specific directory of my site.

Let's say I have 3 files in my remote SFTP directory

www.site.com/files/phone/2017-09-19-20-39-15

  • a.txt
  • b.txt
  • c.txt

My goal is to create a local folder on my desktop with ONLY those downloaded files, with no parent files or parent directories. I am trying to get a clean report.

I've tried

wget -m --no-parent -l1 -nH -P ~/Desktop/phone/ www.site.com/files/phone/2017-09-19-20-39-15 --reject=index.html* -e robots=off

I got

[screenshot: the downloaded files nested inside recreated parent directories]

I want to get

[screenshot: a single folder containing only a.txt, b.txt, and c.txt]

How do I tweak my wget command to get something like that?

Should I use anything other than wget?

Ihue,

Taking a shell-programmatic perspective, I would recommend you try the following command-line script; note that I have also added a citation so you can see the original thread.

wget -r -P ~/Desktop/phone/ -A txt www.site.com/files/phone/2017-09-19-20-39-15 --reject=index.html* -e robots=off

-r enables recursive retrieval. See Recursive Download for more information.

-P sets the directory prefix where all files and directories are saved to.

-A sets a whitelist for retrieving only certain file types. Strings and patterns are accepted, and both can be used in a comma separated list. See Types of Files for more information.
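Note that -r by itself will still recreate the remote path (files/phone/2017-09-19-20-39-15) as nested local subdirectories. If you want the three .txt files dropped directly into ~/Desktop/phone/, as in your second screenshot, you can additionally pass -np (--no-parent) and -nd (--no-directories). A minimal sketch combining them with the command above:

wget -r -np -nd -P ~/Desktop/phone/ -A txt www.site.com/files/phone/2017-09-19-20-39-15 --reject=index.html* -e robots=off

-nd tells wget not to create a directory hierarchy when retrieving recursively, so every accepted file lands directly in the -P prefix, and -np keeps the crawl from ascending into parent directories.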

Ref: @don-joey https://askubuntu.com/questions/373047/i-used-wget-to-download-html-files-where-are-the-images-in-the-file-stored
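As for whether to use something other than wget: since you describe the remote location as an SFTP directory, and if you actually have SSH/SFTP credentials on that host, scp can copy just those files with no mirroring involved. A hedged sketch; the user login name and the absolute remote path are assumptions based on the URL in your question:

scp 'user@www.site.com:/files/phone/2017-09-19-20-39-15/*.txt' ~/Desktop/phone/

The quotes keep your local shell from expanding the wildcard, so the remote side matches *.txt and only those files arrive in the local folder, with no parent directories created.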
