bash: check if multiple files in a directory contain strings from a list

Folks,

I have a text file that contains one string per line:

str1
str2
str3

etc.

I would like to read every line of this file and then search for those strings inside multiple files located in a different directory.

I am not quite sure how to proceed.

Thanks very much for your help.

Use GNU grep's --file Option

According to grep(1):

   -f FILE, --file=FILE
          Obtain  patterns  from  FILE,  one  per  line.   The  empty file
          contains zero patterns, and therefore matches nothing.   (-f  is
          specified by POSIX.)

The -H and -n flags will print the filename and line number of each match. So, assuming you store your patterns in /tmp/foo and want to search all files in /tmp/bar, you could use something like:

# Find regular files under /tmp/bar with GNU find and grep them all
# using the pattern file.
find /tmp/bar -type f -exec grep -Hnf /tmp/foo {} +
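
If searching every file under the directory is acceptable, a shorter variant of the same idea uses GNU grep's -r option directly, still assuming the patterns live in /tmp/foo and the target directory is /tmp/bar:

# Recurse into /tmp/bar and report file name and line number for each match.
grep -rHnf /tmp/foo /tmp/bar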
Alternatively, loop over the word list with the shell (cut -d, -f1 takes the first comma-separated field of each line; quote "$wrd" so each pattern reaches grep intact):

for wrd in $(cut -d, -f1 < testfile.txt); do grep "$wrd" dir/files* ; done

Or let awk load the word list into an array first (NR==FNR is only true while reading the first file) and then test every line of the remaining files against it:

awk 'NR==FNR{a[$0];next} { for (word in a) if ($0 ~ word) print FILENAME, $0 }' fileOfWords /wherever/dir/*
Or read the pattern file line by line in the shell:

while read -r str
do
   echo "$str"
   grep "$str" /path/to/other/files
done < inputfile
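
One detail that applies to all of these approaches: grep treats each pattern as a basic regular expression, so strings containing characters such as . or * may match more than intended. If the list holds literal strings, adding -F switches grep to fixed-string matching. A minimal sketch, reusing the /tmp/foo and /tmp/bar paths assumed above:

# Treat the listed strings as literals rather than regular expressions.
grep -rHnFf /tmp/foo /tmp/bar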
