How to replace a string in multiple files in multiple subfolders with different file extensions in linux using command line
I have already followed this query: ( How to replace a string in multiple files in linux command line ).
My question is an extension of the same.
I want to check only specific file extensions in the subfolders, not every file extension.
What I have already tried:
grep -rli 'old-word' * | xargs -i@ sed -i 's/old-word/new-word/g' @
My problem: it is changing files of every other format as well. I want to search and replace only in one file extension.
Please also add an answer showing how to change an entire line of a file, not just one word.
Thanks in advance.
The simplest solution is to use a more complex grep command:
grep -rli --include="*.html" --include="*.json" 'old-word' *
The disadvantage of this solution is that you do not have clear control over which files are scanned.
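To perform the actual replacement, the same --include filters can feed the xargs/sed pipeline from the question. A minimal sketch, assuming GNU grep/sed and a demo directory created just for illustration (old-word/new-word are the placeholders from the question):

```shell
#!/bin/sh
# Demo layout (hypothetical): one .html, one .json, one .txt file.
dir=$(mktemp -d)
mkdir -p "$dir/sub"
printf 'old-word here\n' > "$dir/sub/page.html"
printf 'old-word here\n' > "$dir/sub/data.json"
printf 'old-word here\n' > "$dir/sub/notes.txt"

# Restrict grep to the wanted extensions, then hand the matching
# file names to sed for in-place replacement.
grep -rli --include="*.html" --include="*.json" 'old-word' "$dir" \
  | xargs -r sed -i 's/old-word/new-word/g'

grep -r 'new-word' "$dir"   # only page.html and data.json were changed
```

Note that piping file names through xargs this way assumes paths without whitespace; for arbitrary names, grep's -Z option together with xargs -0 is safer.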
It is better to adjust the find command to locate your desired files, using the RegExp filtering option -regex to filter file names.
That way you can verify that the correct files are scanned.
Then feed the find command's result to grep as the scanning list.
Assuming you are looking for the file extensions txt, pdf, and html, and your search path begins in /home/user/data:
find /home/user/data -regex ".*\.\(html\|txt\|pdf\)$"
Once you have located your files, grep can match each file from the above find command:
grep -rli 'old-word' $( find /home/user/data -regex ".*\.\(html\|txt\|pdf\)$" )
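For the second part of the question, replacing an entire line rather than one word, sed can match the whole line containing the word with a pattern like .*old-word.*. A sketch, using a hypothetical temp directory and placeholder replacement text:

```shell
#!/bin/sh
# Demo file (hypothetical): one line to keep, one line to replace.
dir=$(mktemp -d)
printf 'keep me\nhas old-word in it\n' > "$dir/doc.txt"

# Select files by extension with find, filter them by content with
# grep -l, then rewrite every matching line wholesale with sed.
grep -rli 'old-word' $( find "$dir" -regex ".*\.\(html\|txt\|pdf\)$" ) \
  | xargs -r sed -i 's/.*old-word.*/entire new line/'

cat "$dir/doc.txt"   # the line containing old-word is replaced wholesale
```

As above, the unquoted $( find ... ) substitution assumes paths without whitespace.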