I need to check the file structure of a directory tree every second, but my script takes too long. I suspect that collecting the timestamps and computing base64 from each file's name is what costs the time in a large directory tree.
How can I gather the tree structure, store it in an array, and loop through each file collecting its timestamp into an array whose key is a cipher of the file path?
Later I compare the array against an older version, which adds further wasted time, so I would like to improve the efficiency.
shopt -s globstar dotglob
declare -A files_new    # associative array: key -> mtime (needed for string keys)
files=("$base"/**)

new() {
    # Drop the first two keys from the previous snapshot
    keys=("${!files_new[@]}")
    for i in "${keys[@]::2}"; do
        unset "files_new[$i]"
    done
    for file in "${files[@]}"; do
        stamp=$(stat -c "%Y" "$file")    # mtime as epoch seconds
        hash=$(echo "$file" | base64)    # encode the path to use as the key
        files_new[$hash]=$stamp
    done
}
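The comparison against the older snapshot mentioned above can also be done with a plain key lookup instead of a re-scan. The following is a minimal sketch of that idea; the array contents and the names files_old, files_new, and changed are illustrative, not from the original script:

```shell
#!/usr/bin/env bash
# Hypothetical example data: previous and current snapshots (path -> mtime).
declare -A files_old=( [/tmp/a]=100 [/tmp/b]=200 )
declare -A files_new=( [/tmp/a]=100 [/tmp/b]=250 [/tmp/c]=300 )

changed=()
for file in "${!files_new[@]}"; do
    # Flag the file if it is new, or if its timestamp differs from the
    # previous snapshot. ${files_old[$file]-} expands to empty when unset.
    if [[ ${files_old[$file]-} != "${files_new[$file]}" ]]; then
        changed+=("$file")
    fi
done
printf '%s\n' "${changed[@]}"
```

With the sample data above, /tmp/b (timestamp changed) and /tmp/c (newly appeared) end up in the changed array; iteration order over an associative array is unspecified.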
I have achieved faster probing by using the find command with -printf output, as suggested by Peter Cordes. The loop went from 3 seconds to about 300 milliseconds.
loop() {
    declare -gA files_new    # global associative array: path -> timestamp
    while IFS= read -r string; do
        stamp=${string%%|*}    # everything before the first "|"
        file=${string#*|}      # everything after the first "|"
        files_new[$file]=$stamp
    done < <(find "$base" -printf "%TD%TT|%p\n")
}