
Bash script to download JPG files from a website

I am a complete beginner at programming Bash scripts (Linux, Debian). One of the scripts I'd like to write should let the user download JPG (or other graphic) files from a website of their choice, e.g. ./code1.sh www.bbc.com . It doesn't interpret the website address; instead an error appears: "No such file or directory: www.bbc.com". Could you please tell me what I am doing wrong?

#!/bin/bash
$1
curl $1 | grep -o -e `$1.\{1,100\}\.jpg` > graph_list.txt
wget `cat graph_list.txt`

Try this line with a simplified regular expression. It will fetch all the pictures (I'm not sure I fully understood how you wanted to filter):

curl www.bbc.com | grep -o -e 'http[^"]*\.jpg' | xargs wget
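You can check what the grep stage of that pipeline extracts without hitting the network. The snippet below feeds a made-up HTML fragment (the example.com URLs are placeholders, not from the question) through the same pattern:

```shell
#!/bin/bash
# Sample HTML fragment standing in for the output of `curl` (URLs are made up):
html='<img src="http://example.com/pics/cat.jpg"><a href="http://example.com/dog.jpg">'

# Same extraction step as in the answer: -o prints only the matching part,
# and [^"]* keeps each match from running past the closing quote of the attribute.
urls=$(printf '%s\n' "$html" | grep -o -e 'http[^"]*\.jpg')

printf '%s\n' "$urls"
```

Piping `$urls` into `xargs -r wget` would then download each match; `-r` (GNU xargs) skips running wget entirely when no URL was found.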

Your code as posted in the question cannot work because of the backticks ( ` ) you used to wrap your regular expression. Backticks trigger command substitution in Bash, which is what you intentionally did in your last line. Use single quotes ( ' ) to prevent word splitting and all expansion, or double quotes ( " ) to prevent word splitting while still allowing expansion.
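A quick sketch of the difference (the sample URL is made up): with single quotes the pattern reaches grep literally, while backticks would first try to execute the pattern as a command, so grep would receive that command's (empty, failed) output instead of the pattern.

```shell
#!/bin/bash
# Single quotes: the pattern is passed to grep exactly as written.
pattern='http[^"]*\.jpg'
match=$(printf '%s\n' '<img src="http://example.com/a.jpg">' | grep -o "$pattern")
echo "$match"

# With backticks, Bash would instead attempt command substitution first:
#   grep -o -e `http[^"]*\.jpg`
# i.e. it tries to RUN http[^"]*\.jpg as a command, which fails with
# "command not found", and grep gets no pattern at all.
```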

