I need to download a list of files from each of the domains listed in another file. I've tried several times, but I still can't get it to work.
An example list of files to download (e.g., file.txt):
1.jpg
2.jpg
3.jpeg
4.bmp
5.gif
An example list of domains (e.g., url.lst):
google.com
google.co.in
google.com.br
The script:
#!/bin/bash
# Create an array files that contains list of filenames
urls=`cat "url.lst"`
files=`cat "file.txt"`
for ((file in "${files[@]}" && url in "${urls[@]}")); do
wget "${url}${file}"
done
I want it to generate and run the following commands:
wget google.com/1.jpg
wget google.com/2.jpg
wget google.com/3.jpeg
wget google.com/4.bmp
wget google.com/5.gif
wget google.co.in/1.jpg
wget google.co.in/2.jpg
wget google.co.in/3.jpeg
wget google.co.in/4.bmp
wget google.co.in/5.gif
wget google.com.br/1.jpg
wget google.com.br/2.jpg
wget google.com.br/3.jpeg
wget google.com.br/4.bmp
wget google.com.br/5.gif
You have a few things going on here. First, reading the files into variables with cat doesn't create arrays. You get a single string, which will then be subject to word splitting, globbing, and the like. Second, you need to loop over the two lists separately, one loop nested inside the other, instead of trying to do it in a single for statement.
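The difference is easy to see with a quick experiment (a sketch; the scratch-file path and variable names below are made up for illustration):

```shell
#!/bin/bash
# Write three lines to a scratch file (path is just for this demo).
printf '%s\n' a b c > /tmp/demo.txt

# Command substitution yields one string with embedded newlines...
files=$(cat /tmp/demo.txt)
echo "${#files[@]}"    # 1 -- a scalar, not an array

# ...while readarray yields a real array, one element per line.
readarray -t arr < /tmp/demo.txt
echo "${#arr[@]}"      # 3
```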
To fix the first part, I'd suggest using readarray or mapfile; for the second, use nested loops like:
readarray -t urls < url.lst
readarray -t files < file.txt
for dom in "${urls[@]}"; do
    for path in "${files[@]}"; do
        wget "$dom/$path"
    done
done
Or you could replace the outer for loop with a while loop and skip one of the readarray calls, like:
readarray -t files < file.txt
while read -r url; do
    for path in "${files[@]}"; do
        wget "$url/$path"
    done
done < url.lst
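As a sanity check before downloading anything, you can echo each command instead of executing it. This sketch first writes the sample lists from the question to disk (note: it overwrites url.lst and file.txt in the current directory), then prints the fifteen wget commands; drop the echo to actually download:

```shell
#!/bin/bash
# Recreate the sample inputs from the question (demo only).
printf '%s\n' google.com google.co.in google.com.br > url.lst
printf '%s\n' 1.jpg 2.jpg 3.jpeg 4.bmp 5.gif > file.txt

readarray -t files < file.txt
while read -r url; do
    for path in "${files[@]}"; do
        echo wget "$url/$path"   # dry run: print, don't download
    done
done < url.lst > commands.txt

wc -l < commands.txt   # 15 -- one command per domain/file pair
```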