
How can I pipe to the next command with new environment variables?

I have a shell script that declares some variables:

export X=1
export Y=2

and I'd like to be able to do something like this:

. ./path/to/script | command_that_has_access_to_X_and_Y

Basically, source the script somehow, so that the command following the pipe can access those variables. Is such a thing possible?

EDIT: One of the commands I'd like to run is pg_dump, and the credentials are in a shell file:

Basically I'm trying to run this:

bash -c "pg_dump \$PRODUCTION_DB --password \$PRODUCTION_PASSWORD --user \$PRODUCTION_USERNAME --host \$PRODUCTION_HOST > #{backup_name}.sql"

There's no need to use a pipe. Assuming you have the export commands in the first script, those variables will be available to the second script.

. ./path/to/script
command_that_has_access_to_X_and_Y
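As a minimal demonstration of this (using the X and Y values from the question, with a temporary file standing in for ./path/to/script): sourcing the script sets the variables in the current shell, and because they are exported, any child process started afterwards inherits them.

```shell
# Stand-in for ./path/to/script:
cat > /tmp/vars.sh <<'EOF'
export X=1
export Y=2
EOF

# Source it in the current shell, then run a child process:
. /tmp/vars.sh
bash -c 'echo "X=$X Y=$Y"'   # prints: X=1 Y=2
```

The same pattern applies to the pg_dump case in the question: source the credentials file first, then run pg_dump as a normal command; no pipe is involved.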

A pipeline is simply a tool for connecting the standard output of one command to the standard input of another. It's an efficient alternative to using a temporary file; x | y is more-or-less the same as

x > tmp.txt
y < tmp.txt

except the operating system handles the details of passing text from x to y so that both can run at the same time, with y receiving input as x produces it.
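A quick sketch of that equivalence, with x and y as hypothetical stand-in commands (shell functions here) rather than real programs:

```shell
# x writes to stdout, y reads from stdin:
x() { printf 'hello\n'; }
y() { tr 'a-z' 'A-Z'; }

# Temp-file version:
x > /tmp/tmp.txt
y < /tmp/tmp.txt   # prints: HELLO

# Pipe version; same result, no intermediate file:
x | y              # prints: HELLO
```

Note that in neither version does y see x's variables; only the text x writes to its standard output crosses over.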

You could send the values over the pipe with echo instead:

Script on the left:

#!/bin/bash
X='Something X'
Y='Something Y'
echo "$X"
echo "$Y"

Command on the right:

... | bash -c "read -r X; read -r Y; echo \"\$X : \$Y\""

Produces:

Something X : Something Y
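A related sketch, not from the answers above: if the goal is simply to hand specific variables to one command, env(1) can place them in that command's environment directly, with no sourcing or piping. The names and values here mirror the echo example.

```shell
# Set X and Y only for this one command's environment:
env X='Something X' Y='Something Y' bash -c 'echo "$X : $Y"'
# prints: Something X : Something Y
```

The variables exist only inside the bash -c process; the calling shell's environment is untouched.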
