I created an alias so that I don't have to type ls every time I move into a new directory:
alias cl='cd_(){ cd "$@" && ls; }; cd_'
Let us say I have a folder named "Downloads" (which of course I happen to have) so I just type the following in the terminal:
cl Downloads
Now I find myself in the "Downloads" folder and get a listing of its contents, say: example.txt, hack.hs, picture.jpg, ...
If I want to move to a directory and check whether it contains, say, hack.hs, I could try something like this:
cl Downloads | grep hack
What I get is just the output:
hack.hs
But I remain in the folder I started in (that is, I am not in Downloads).
I understand this happens because each component of a pipeline is executed in a subshell: cd Downloads && ls runs in a subshell of its own, and its output (the file listing) is redirected through the pipe to grep. That is why I do not end up in the new folder.
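The behaviour described above is easy to demonstrate. This is a minimal sketch (the starting directory is an arbitrary temp dir, an assumption for illustration):

```shell
start=$(mktemp -d)   # illustrative starting directory
cd "$start"

# cd is part of a pipeline here, so it runs in a subshell and
# the directory change never reaches the parent shell.
cd /tmp | cat

pwd    # still prints $start, not /tmp
```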
My question is the following:
How can I write something like "cl Downloads | grep hack" so that I get the "hack"-grepped listing AND end up in the Downloads folder?
Thank you very much, Pol
For anyone ever googling this: a quick fix was proposed by @gniourf_gniourf:
cl Downloads > >(grep hack)
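Applied to the setup from the question, the fix might look like the sketch below (a function is used instead of the alias because aliases don't expand in scripts, and the Downloads fixture is assumed for illustration):

```shell
# Function equivalent of the alias from the question.
cl() { cd "$@" && ls; }

# Illustrative fixture: a Downloads directory containing hack.hs.
demo=$(mktemp -d)
mkdir "$demo/Downloads"
touch "$demo/Downloads/hack.hs" "$demo/Downloads/picture.jpg"
cd "$demo"

# Process substitution: cl runs in the current shell, so the cd
# persists; grep receives the listing through >(...).
cl Downloads > >(grep hack)   # prints hack.hs (possibly after the
                              # next line, since grep runs async)
pwd                           # ends in /Downloads
```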
Some marked this question as a possible duplicate of Make a bash alias that takes arguments, but the fact that my bash alias already takes arguments shows that this is not the case. The problem at hand was how to execute a command in the current shell while at the same time redirecting its output to another command.
As you're aware (and as is covered in BashFAQ #24), the reason
{ cd "$@" && ls; } | grep ...
prevents the results of cd from being visible in the outer shell is that no component of a pipeline is guaranteed by POSIX to run in the parent shell. (Some shells, including ksh out-of-the-box and recent bash with the non-default lastpipe option enabled, will sometimes or optionally run the last component of a pipeline in the parent shell, but this can't portably be relied on.)
A way to avoid this, that's applicable to all POSIX shells, is to direct output to a named pipe, thus avoiding setting up a pipeline:
mkfifo mypipe
grep ... <mypipe &
{ cd "$@" && ls; } >mypipe
In modern ksh and bash, there's a shorter syntax that will do this for you, using /dev/fd entries instead of setting up a named pipe if the operating system provides that facility:
{ cd "$@" && ls; } > >(grep ...)
In this case, >(grep ...) is replaced with a filename pointing to either a FIFO or a /dev/fd entry that, when written to by the process in question, redirects output to grep, all without setting up a pipeline.
By the way, I really do hope your use of ls in this manner is just an example. The output of ls is not well-specified for the full range of possible filenames, so grepping it is inherently unreliable. Consider using printf '%s\0' * to emit a NUL-delimited list of the non-hidden names in a directory if you really do want to build a streamed result, or use glob expressions to check for files matching a specific pattern (BashFAQ #4 covers a similar scenario); extglobs are available if you need something closer to full regex matching than POSIX patterns support.
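Both alternatives can be sketched as follows (file names are assumptions taken from the question; the -z flag is a GNU grep extension, not available in every grep):

```shell
# Fixture for illustration:
dir=$(mktemp -d)
cd "$dir"
touch hack.hs example.txt picture.jpg

# NUL-delimited listing, safe for any filename; GNU grep -z
# treats NUL as the line separator.
printf '%s\0' * | grep -z hack | tr '\0' '\n'   # prints: hack.hs

# Or skip parsing entirely and let the shell match a glob:
for f in *hack*; do
  [ -e "$f" ] && printf '%s\n' "$f"             # prints: hack.hs
done
```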