I am trying:
1. wget -i url.txt
and
2. wget -O output.ext
How do I combine both? I want to download the URLs listed in url.txt and save each one under a name I specify, as separate files.
Define all the URLs in url.txt, then give this a try and see if it does what you need:
for url in $(cat url.txt); do wget "$url" -O "$url.out"; done
If your URLs contain path components, this variant replaces each slash with an underscore so the result is a valid filename:
for url in $(cat url.txt); do wget "$url" -O "$(echo "$url" | sed 's/\//_/g').out"; done
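The same slash-to-underscore renaming can be done without sed, using a while-read loop and bash parameter expansion (which also copes with spaces in lines better than the for-loop above). A minimal sketch, with a made-up example URL and the wget command echoed instead of executed:

```shell
# Read url.txt line by line; ${url//\//_} replaces every "/" with "_".
# The wget call is only echoed here so you can inspect the result first.
while IFS= read -r url; do
    name="${url//\//_}.out"
    echo wget "$url" -O "$name"
done <<'EOF'
https://example.com/a/b.txt
EOF
```

Drop the echo once the generated filenames look right.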
In this situation, I think you need two files with the same number of lines, to map each URL to a corresponding name:
url.txt
(source file containing your urls, example content given here):
https://svn.apache.org/repos/asf/click/trunk/examples/click-spring-cayenne/README.txt
https://svn.apache.org/repos/asf/click/trunk/examples/click-spring-cayenne/README.txt
output_names.txt
(filenames you want to assign):
readme1.txt
readme2.txt
Then iterate over both files and pass their contents to wget, e.g. with the following script:
#!/bin/bash
# Read each input file into an array, one line per element.
IFS=$'\n' read -d '' -r -a url < "$1"
IFS=$'\n' read -d '' -r -a output < "$2"

# Download each URL under the name on the matching line.
len=${#url[@]}
for ((i = 0; i < len; i++)); do
    wget "${url[$i]}" -O "${output[$i]}"
done
Call:
./script url.txt output_names.txt
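If you prefer to avoid the arrays, the two files can also be paired line by line with paste(1), which joins matching lines with a tab. A sketch using inline sample data via process substitution (the wget command is echoed, not run):

```shell
# paste glues line N of the first input to line N of the second,
# tab-separated; read splits them back apart on that tab.
paste <(printf '%s\n' 'https://example.com/one' 'https://example.com/two') \
      <(printf '%s\n' 'readme1.txt' 'readme2.txt') |
while IFS=$'\t' read -r url name; do
    echo wget "$url" -O "$name"
done
```

With your real files this would be `paste url.txt output_names.txt | while ...`, with the echo removed.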