
Can I use wget to download multiple files from a Linux terminal?

Suppose I have a directory accessible via HTTP, e.g.

http://www.abc.com/pdf/books

Inside the folder I have many PDF files.

Can I use something like

wget http://www.abc.com/pdf/books/*

wget -r -l1 -A.pdf http://www.abc.com/pdf/books

From the wget man page:

  Wget can follow links in HTML and XHTML pages and create local versions of remote web sites, fully recreating the directory structure of the original site. This is sometimes referred to as ``recursive downloading.'' While doing that, Wget respects the Robot Exclusion Standard (/robots.txt). Wget can be instructed to convert the links in downloaded HTML files to the local files for offline viewing. 

and

  Recursive Retrieval Options 
   -r
   --recursive
       Turn on recursive retrieving.

   -l depth
   --level=depth
       Specify recursion maximum depth level depth.  The default maximum depth is 5.

It depends on the web server and its configuration. Strictly speaking, a URL is not a directory path, so http://something/books/* is meaningless.

However, if the web server serves http://something/books as an index page listing all the books on the site, then you can play around with the recursive and spider options, and wget will happily follow any links found on that index page.
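As a sketch, assuming http://something/books (a placeholder host from the question) serves an HTML index page, the flags from the answer above can be combined like this; `-nd` and `-np` are additions not mentioned in the original answer:

```shell
# Hypothetical URL from the question; adjust to the real index page.
# -r        recurse into links found on the page
# -l1       limit recursion depth to 1 (only the index page's direct links)
# -A.pdf    accept (keep) only files whose names end in .pdf
# -nd       save files flat instead of recreating the remote directory tree
# -np       never ascend to the parent directory while recursing
wget -r -l1 -A.pdf -nd -np http://something/books/
```

Note that, as the man page excerpt says, wget respects robots.txt by default, so a server that disallows crawling may cause the recursive fetch to skip files.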
