
Downloading files with Makefile rules

I was planning something like:

URLS=www.host.com/file1.tar.gz www.host2.com/file2.tar.gz
$(somefunc $(URLS)): # somefunc produces downloads/file1.tar.gz downloads/file2.tar.gz
   mkdir -p downloads
   wget whatever # I can't get the real URL here because the targets no longer contain the full URL

myproject: $(somefunc $(URLS))
   # Files should already be in downloads/ here

The problem is that if I convert URLS with somefunc, I lose the URLs; but if I don't, I can't use the results as targets, so make can't skip files that have already been downloaded.

Any ideas?

If somefunc only modifies the path, not the actual filename, and there are no duplicates, you could try searching $(URLS) for the original filename.

Maybe something like this? (untested)

$(somefunc $(URLS)): # somefunc produces downloads/file1.tar.gz downloads/file2.tar.gz
   mkdir -p $(dir $@)
   wget $(filter $(addprefix %/,$(notdir $@)),$(URLS)) -O $@
  • $(notdir $@) evaluates to file1.tar.gz for one of the targets.
  • $(addprefix %/,file1.tar.gz) evaluates to %/file1.tar.gz.
  • $(filter %/file1.tar.gz,www.host.com/file1.tar.gz www.host2.com/file2.tar.gz) evaluates to www.host.com/file1.tar.gz.

(Presumably you want an http:// on there too?)
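
Pulled together into one self-contained Makefile, that answer might look like the sketch below. It is untested and rests on the same assumptions as above: somefunc only moves each file into downloads/ (so it is replaced here by a hypothetical DOWNLOADS variable built from $(notdir)), the hostnames are the placeholders from the question, and http:// is prepended as suggested. Recipe lines must be indented with a tab.

URLS = www.host.com/file1.tar.gz www.host2.com/file2.tar.gz

# Stand-in for somefunc: map every URL to downloads/<filename>.
DOWNLOADS = $(addprefix downloads/,$(notdir $(URLS)))

# Each downloads/<file> target recovers its own URL from $(URLS)
# by filtering on its filename, then fetches it with wget.
$(DOWNLOADS):
	mkdir -p $(dir $@)
	wget http://$(filter $(addprefix %/,$(notdir $@)),$(URLS)) -O $@

.PHONY: myproject
myproject: $(DOWNLOADS)
	@echo "downloads are in place"

Because each file is its own target, make skips anything that already exists in downloads/ and can fetch the remaining files in parallel with -j.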

At this point, I don't think you can do it directly with make. But here's a solution if you are willing to use some scripting:

$(URLS):
    @for url in $(URLS); do \
        if [ ! -e $$(somefunc $${url}) ]; then \
            echo wget $${url}; \
            wget $${url}; \
        fi; \
    done
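
One caveat with this version: somefunc has to exist as an executable on the shell's PATH, and it is never defined in the question. A purely hypothetical stub with the behaviour the asker described (URL in, downloads/<filename> out) would make the loop testable:

#!/bin/sh
# Hypothetical somefunc: print the local path a given URL is saved to,
# i.e. downloads/<basename of the URL>.
echo "downloads/$(basename "$1")"

Also note that because the whole loop runs under a single rule, make never compares target timestamps here; the [ ! -e ... ] test inside the loop is the only thing that prevents re-downloading.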
