To process a text file line by line in bash, I usually implement a while loop like this:
function doSomething() {
    local inputFile="$1"
    local fd=""
    local line=""
    exec {fd}<"$inputFile"    # open file, let bash pick a free descriptor
    echo "Opened ${inputFile} for read using descriptor ${fd}"
    while IFS='' read -r -u "$fd" line || [[ -n "$line" ]]; do
        echo "read = \"$line\""
    done
    exec {fd}<&-    # close file
    return 0
}
This works on Linux, but unfortunately not on OSX, whose /bin/bash (3.2) predates the {fd} automatic-allocation syntax introduced in bash 4.1. For OSX I currently have to change the code to something like this:
exec 3<"$inputFile"    # open file
while IFS='' read -r -u 3 line || [[ -n "$line" ]]; do
    echo "read = \"$line\""
done
exec 3<&-    # close file
But this has the disadvantage that I have to manage the file descriptor numbers myself (in the first script, I let bash choose an available file descriptor number).
Does anyone have a solution for this that works on both Linux and OSX?
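One possible workaround, sketched here under the assumption that the fallback descriptor 3 is otherwise unused (this is not from the question itself): check the bash version at runtime and only use the {fd} allocation syntax when it is available, falling back to a fixed descriptor on older shells such as OSX's bash 3.2.

```shell
#!/usr/bin/env bash
# Sketch: use automatic fd allocation ({fd}, added in bash 4.1) when
# available, otherwise fall back to the fixed descriptor 3.
doSomething() {
    local inputFile="$1"
    local line=""
    local fd=""
    if (( BASH_VERSINFO[0] > 4 || (BASH_VERSINFO[0] == 4 && BASH_VERSINFO[1] >= 1) )); then
        exec {fd}<"$inputFile"    # bash picks a free descriptor
    else
        fd=3
        exec 3<"$inputFile"       # fixed descriptor on old bash
    fi
    while IFS='' read -r -u "$fd" line || [[ -n "$line" ]]; do
        echo "read = \"$line\""
    done
    eval "exec $fd<&-"    # eval, so closing a variable descriptor works on old bash too
}
```

The eval on the last line is the price of portability: bash 3.2 has no way to close a descriptor whose number is held in a variable without it.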
Note that, for some reason, I don't want to use piping or I/O redirection on the complete loop like this (because I don't want to execute the loop in a different process):
while IFS='' read -r line || [[ -n "$line" ]]; do
    echo "read = \"$line\""
done < "$inputFile"
The last loop will not fork a new process. You can verify that by printing "$BASHPID" inside and outside of the loop.
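That check can be sketched like this (the temp file is only there to give the loop something to read):

```shell
#!/usr/bin/env bash
# Verify that a loop with plain input redirection runs in the same
# process as the surrounding script.
tmp=$(mktemp)
printf 'a\nb\n' > "$tmp"

outer=$BASHPID
inner=""
while IFS='' read -r line || [[ -n "$line" ]]; do
    inner=$BASHPID    # assignment survives the loop: no subshell involved
done < "$tmp"

[ "$outer" = "$inner" ] && echo "same process"
rm -f "$tmp"
```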
New processes are only created for pipelines. Simple redirections are handled by temporary dups within the bash process.
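By contrast, feeding the loop through a pipeline does put it in a subshell (unless bash's lastpipe option is enabled), which is easy to see because variable assignments made inside it are lost:

```shell
#!/usr/bin/env bash
# Each element of a pipeline runs in a subshell by default, so the
# loop below increments a copy of count that the parent never sees.
count=0
printf 'a\nb\n' | while IFS='' read -r line; do
    count=$((count + 1))    # modifies the subshell's copy only
done
echo "$count"    # prints 0
```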
Feel free to use standard stdin/stdout redirection. It's no more expensive than redirection done with the exec builtin.
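Applied to the function from the question, that recommendation looks like this; same behavior on both Linux and OSX, and no descriptor to manage at all:

```shell
#!/usr/bin/env bash
# Portable rewrite of doSomething: redirect the whole loop instead of
# opening an explicit file descriptor.
doSomething() {
    local inputFile="$1"
    local line=""
    while IFS='' read -r line || [[ -n "$line" ]]; do
        echo "read = \"$line\""
    done < "$inputFile"
}
```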