
Download files with a bash script using wget

I'm trying to write a simple script that downloads a list of files. The filenames are listed in one .txt file, and the base address they should be downloaded from is stored in a separate .txt file. The script should loop over the filename list and download each file from that address, but I can't get it to work; I've tried many times and always failed.

file.txt
1.jpg
2.jpg
3.jpg
4.mp3
5.mp4

=====================================

url.txt
url = https://google.com.ph/

=====================================

download.sh
#!/bin/sh
url=$(awk -F = '{print $2}' url.txt)
for i in $(cat file.txt);
do 
wget $url
done

Your help is greatly appreciated.

Other than the obvious issue that R Sahu pointed out in his answer, you can avoid:

  • Using awk to parse your url.txt file.
  • Using for i in $(cat file.txt) to iterate over file.txt.

Here is what you can do:

#!/bin/bash

# Create an array files that contains list of filenames
files=($(< file.txt))

# Read url.txt and run the wget command for every filename
while IFS='=| ' read -r param uri; do 
    for file in "${files[@]}"; do 
        wget "${uri}${file}"
    done
done < url.txt
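To see why both the array version and the question's $(cat file.txt) loop can break on filenames containing spaces, here is a small offline sketch. It recreates the question's two input files (with one tricky filename added as an assumption for illustration) and prints the URLs it would fetch instead of calling wget:

```shell
#!/bin/bash
# Offline sketch: recreate the question's input files, plus one filename
# containing a space, and print the URLs instead of downloading them.
printf '%s\n' '1.jpg' '2 copy.jpg' > file.txt
printf 'url = https://google.com.ph/\n' > url.txt

# Parse "url = ..." with parameter expansion instead of awk:
# strip everything up to and including "= ".
read -r line < url.txt
url=${line#*= }

# Read file.txt line by line; unlike for i in $(cat file.txt),
# this keeps "2 copy.jpg" as a single filename.
while IFS= read -r file; do
    printf '%s%s\n' "$url" "$file"   # swap printf for: wget "${url}${file}"
done < file.txt
```

Running it prints https://google.com.ph/1.jpg and https://google.com.ph/2 copy.jpg on separate lines; with the $(cat ...) loop, "2 copy.jpg" would be split into two bogus downloads.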

Instead of

wget $url

Try

wget "${url}${i}"
