How to curl all files recursively

Here's the code I'm currently using:

xargs -n 1 curl -s -o /dev/null -w "%{http_code} - %{url_effective}\n" < 'file.txt'

This works: it requests every URL in file.txt and prints the status code alongside the URL that was fetched. However, I need to do this for every file, recursively.

I've tried a few things, like this, but none of them worked:

xargs -n 1 curl -s -o /dev/null -w "%{http_code} - %{url_effective}\n" < *

There are about 3000 files spread across multiple directories. Is there any way to do this recursively?

Assume your URL lists follow a naming pattern such as file.txt, file1.txt, file2.txt under the current working directory, including its subdirectories.

A redirection like < * fails because the shell's < operator accepts only a single file. Instead, you can merge all the links from those files with cat **/file*.txt and pipe the result into xargs:

cat **/file*.txt | xargs -n 1 curl -s -o /dev/null -w "%{http_code} - %{url_effective}\n"
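Note that the ** glob only recurses when the shell supports it: zsh enables it by default, while bash needs shopt -s globstar (available since bash 4). If that option isn't available, a find-based pipeline is a portable alternative; a minimal sketch, assuming the same file*.txt naming pattern:

find . -type f -name 'file*.txt' -exec cat {} + | xargs -n 1 curl -s -o /dev/null -w "%{http_code} - %{url_effective}\n"

With roughly 3000 files' worth of URLs, running the requests sequentially will be slow; both GNU and BSD xargs accept -P to run several curl processes in parallel, e.g. xargs -n 1 -P 8 curl ... with the same options.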

Good luck!
