Python: subprocess with pipes, exit on failure
I am trying to prevent the upload to S3 when any of the preceding pipeline commands fail; unfortunately, neither of these approaches works as expected:
Shell pipeline
```python
for database in sorted(databases):
    cmd = ("bash -o pipefail -o errexit -c "
           "'mysqldump -B {database} | gpg -e -r {GPGRCPT} | "
           "gof3r put -b {S3_BUCKET} -k {database}.sql.e'").format(
               database=database, GPGRCPT=GPGRCPT, S3_BUCKET=S3_BUCKET)
    try:
        subprocess.check_call(cmd, shell=True, executable="/bin/bash")
    except subprocess.CalledProcessError as e:
        print e
```
Dealing with PIPE
```python
for database in sorted(databases):
    try:
        cmd_mysqldump = "mysqldump {database}".format(database=database)
        p_mysqldump = subprocess.Popen(shlex.split(cmd_mysqldump),
                                       stdout=subprocess.PIPE)
        cmd_gpg = "gpg -a -e -r {GPGRCPT}".format(GPGRCPT=GPGRCPT)
        p_gpg = subprocess.Popen(shlex.split(cmd_gpg),
                                 stdin=p_mysqldump.stdout,
                                 stdout=subprocess.PIPE)
        p_mysqldump.stdout.close()
        cmd_gof3r = "gof3r put -b {S3_BUCKET} -k {database}.sql.e".format(
            S3_BUCKET=S3_BUCKET, database=database)
        p_gof3r = subprocess.Popen(shlex.split(cmd_gof3r),
                                   stdin=p_gpg.stdout,
                                   stderr=open("/dev/null"))
        p_gpg.stdout.close()
    except subprocess.CalledProcessError as e:
        print e
```
I also tried something like this, with no luck:
```python
....
if p_gpg.returncode == 0:
    cmd_gof3r = "gof3r put -b {S3_BUCKET} -k {database}.sql.e".format(
        S3_BUCKET=S3_BUCKET, database=database)
    p_gof3r = subprocess.Popen(shlex.split(cmd_gof3r),
                               stdin=p_gpg.stdout,
                               stderr=open("/dev/null"))
    p_gpg.stdout.close()
...
```
Basically, `gof3r` streams data to S3 even when there are errors, for example when I deliberately change `mysqldump` to `mysqldumpp` to provoke a failure.
All commands in a pipeline run concurrently, e.g.:

```shell
$ nonexistent | echo it is run
```

`echo` always runs even if the `nonexistent` command fails.
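This concurrency is easy to verify from Python. The following is a minimal sketch; `nonexistent_command_xyz` is just a stand-in name assumed not to exist on the system:

```python
import subprocess

# The pipeline's exit status is that of the *last* command (echo),
# and echo runs even though the first command cannot be found.
proc = subprocess.run(
    ["bash", "-c", "nonexistent_command_xyz 2>/dev/null | echo it is run"],
    stdout=subprocess.PIPE, text=True)
print(proc.stdout.strip())   # it is run
print(proc.returncode)       # 0
```

This is exactly why the upload keeps happening: by the time the failure is visible, `gof3r` has already started.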
`pipefail` affects the exit status of the pipeline as a whole; it does not make `gof3r` exit any sooner. `errexit` has no effect because there is only a single pipeline here. If what you mean is that you don't want to start the next pipeline when one from a previous iteration has failed, then put `break` after `print e` in the exception handler.
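A minimal sketch of that `break`, with the real pipeline replaced by a deliberately failing stand-in command (`false | true`, which fails under `pipefail`):

```python
import subprocess

databases = ["db1", "db2", "db3"]
attempts = []
for database in sorted(databases):
    # Stand-in for the real mysqldump | gpg | gof3r pipeline:
    cmd = "bash -o pipefail -c 'false | true'"
    attempts.append(database)
    try:
        subprocess.check_call(cmd, shell=True)
    except subprocess.CalledProcessError as e:
        print(e)
        break  # a pipeline failed: don't start the next database's pipeline
```

After the first failure the loop stops, so only one database is ever attempted.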
While `gpg` is still running, `p_gpg.returncode is None`. If you don't want `gof3r` to run when `gpg` fails, then you have to save gpg's output somewhere else first, for example in a file:
```python
filename = 'gpg.out'
for database in sorted(databases):
    pipeline_no_gof3r = ("bash -o pipefail -c 'mysqldump -B {database} | "
                         "gpg -e -r {GPGRCPT}'").format(**vars())
    with open(filename, 'wb', 0) as file:
        if subprocess.call(shlex.split(pipeline_no_gof3r), stdout=file):
            break  # don't upload to S3, don't run the next database pipeline
    # upload the file on success
    gof3r_cmd = 'gof3r put -b {S3_BUCKET} -k {database}.sql.e'.format(**vars())
    with open(filename, 'rb', 0) as file:
        if subprocess.call(shlex.split(gof3r_cmd), stdin=file):
            break  # don't run the next database pipeline
```
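The `returncode is None` behaviour mentioned above can be seen directly. A small sketch, using `sleep` as a long-running stand-in for `gpg`:

```python
import subprocess

p = subprocess.Popen(["sleep", "1"])
first = p.returncode      # None: the process is still running
p.wait()                  # block until it exits
second = p.returncode     # 0 after a clean exit
print(first, second)
```

This is why checking `p_gpg.returncode == 0` right after `Popen` can never work: at that point the attribute has not been set yet.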
I had exactly the same problem and managed it with the following:
```python
cmd = "cat file | tr -d '\\n'"
subprocess.check_call(['/bin/bash', '-o', 'pipefail', '-c', cmd])
```
Searching back through my code, I see I also used another approach:
```python
subprocess.check_call("ssh -c 'make toto 2>&1 | tee log.txt ; exit ${PIPESTATUS[0]}'", shell=True)
```
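The `${PIPESTATUS[0]}` trick can be checked in isolation. A sketch, with `false | true` as a stand-in for the real pipeline:

```python
import subprocess

# The plain pipeline reports the status of the last command (true): 0.
last = subprocess.call(["bash", "-c", "false | true"])
# PIPESTATUS[0] recovers the status of the first command (false): 1.
first = subprocess.call(["bash", "-c", "false | true; exit ${PIPESTATUS[0]}"])
print(last, first)   # 0 1
```

Note that `PIPESTATUS` is a bash-specific array, so this only works when the command actually runs under bash.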