Delete the 10 largest regular files using a shell script
I'm trying to delete the 10 largest regular files from a given directory, but my approach doesn't work for file names that contain whitespace characters.
My code (it works if the file names don't contain whitespace characters):
find mydir -type f -exec du -ahb {} + | sort -n -r | cut -f2 | head -n 10 | xargs rm -i
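For context on why this pipeline breaks (a minimal illustration, not part of the original question): by default xargs splits its input on any whitespace, so a path containing a space reaches rm as two separate arguments:

```shell
# xargs' default tokenizer splits on blanks and newlines alike,
# so the single name "my file.txt" becomes two arguments:
printf 'my file.txt\n' | xargs -n 1 echo
# echo runs twice, once with "my" and once with "file.txt"
```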
I also tried this, but it gives an error message:
find mydir -type f -exec du -ahb {} + -print 0 | sort -n -r | cut -f2 | head -n 10 | xargs -0 rm -i
The following should work, at least with GNU coreutils 8.25 and newer:
find mydir -type f -exec du -0b {} + | sort -znr | cut -zf2 | head -zn 10 | xargs -0pn 1 rm
I made sure every command handles and outputs NUL-byte (\0) separated records rather than linefeed-separated records:

- du outputs NUL-separated records with -0
- sort, cut and head handle and output NUL-separated records with -z
- xargs handles NUL-separated records with -0

Additionally, I removed the interactive mode of rm and asked xargs to handle the prompting instead (-p), because xargs didn't give rm any way to prompt you when invoking it. I had to limit the number of parameters passed to rm at a time to 1 for this to work (xargs' -n 1 parameter). There might be a way to preserve the -i and give rm access to your prompt, but I don't know how.
Last point: I removed du's -h (human-readable) mode because it would often make the sort fail, and it served no purpose since the file sizes are never displayed to a human.