I have a process myProcess1 that produces both stdout and stderr output. I want to pipe the two output streams into two different downstream processes, myProcess2 and myProcess3, which will massage the data and then dump the results into two different files. Is it possible to do this with a single command? If not, the second-best option would be to run two separate commands, one to process stdout and the other to process stderr. In that case, the first run would simply be:
myProcess1 | myProcess2 > results-out.txt
What would be a similar command to process stderr? Thanks.
Without fancy games, something like this should work:
{ myProcess1 | myProcess2 > results-out.txt; } 2>&1 | myProcess3 > results-err.txt
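As a runnable sketch of how that swap works, with a hypothetical `produce` function standing in for myProcess1 and `sed` commands standing in for the two downstream processes:

```shell
# Hypothetical stand-in producer: one line to stdout, one to stderr.
produce() { echo "out-line"; echo "err-line" >&2; }

# Inside the braces, stdout is already captured in results-out.txt,
# so after 2>&1 only stderr is left to flow into the pipe.
{ produce | sed 's/^/OUT:/' > results-out.txt; } 2>&1 | sed 's/^/ERR:/' > results-err.txt
```

One caveat: the 2>&1 applies to the whole brace group, so myProcess2's own stderr (if it produces any) also travels down the pipe to myProcess3.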
With fancy games (which do not work in /bin/sh, etc.) you could do something like this:
myProcess1 2> >(myProcess3 > results-err.txt) | myProcess2 > results-out.txt
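A sketch of the process-substitution form; since >(...) is a bash/ksh/zsh feature, the example invokes bash explicitly so it also runs from plain sh. The `produce` function is again a hypothetical stand-in:

```shell
bash -c '
  # Hypothetical stand-in for myProcess1.
  produce() { echo "out-line"; echo "err-line" >&2; }

  # stderr is fed to the >(...) reader; stdout continues down the pipe.
  produce 2> >(sed "s/^/ERR:/" > results-err.txt) | sed "s/^/OUT:/" > results-out.txt

  # The >(...) reader runs asynchronously; give it a moment to finish
  # before anything reads results-err.txt.
  sleep 0.2
'
```

The sleep is a crude way to handle the fact that the pipeline can return before the >(...) process has flushed and closed its output file.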
A named pipe or two would work well for this, but you'll probably find it easiest to do with multiple terminal windows.
In one terminal window:
mknod stdout.pipe p
mknod stderr.pipe p
myProcess1 > stdout.pipe 2> stderr.pipe
In a second window:
myProcess2 < stdout.pipe
And in a third:
myProcess3 < stderr.pipe
Using background processes and some other tricks, you could do it all in one window, but keeping the output of your secondary processes separated would be another challenge...
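For what it's worth, a single-window sketch along those lines: `mkfifo` (the portable POSIX equivalent of `mknod ... p`) creates the pipes, the two readers run in the background with their output redirected to separate files to keep them apart, and `wait` collects them. The `sed` commands and the brace group are hypothetical stand-ins for myProcess2/myProcess3 and myProcess1:

```shell
mkfifo stdout.pipe stderr.pipe

# Background readers, each writing to its own file so their
# outputs stay separated.
sed 's/^/OUT:/' < stdout.pipe > results-out.txt &
sed 's/^/ERR:/' < stderr.pipe > results-err.txt &

# Stand-in for myProcess1: one line per stream. Opening a FIFO
# for writing blocks until the matching reader has opened it,
# which is why the readers must be started first.
{ echo "out-line"; echo "err-line" >&2; } > stdout.pipe 2> stderr.pipe

wait                          # let both readers drain and exit
rm stdout.pipe stderr.pipe
```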
You could eliminate one of the named pipes (and thus one of the extra windows) with either of these:
myProcess1 2> stderr.pipe | myProcess2
or
{ myProcess1 > stdout.pipe; } 2>&1 | myProcess3