Strange problem with running bash scripts from python in docker
I have a python script which runs bash scripts via the subprocess library. I need to collect stdout and stderr to files, so I have a wrapper like:

    def execute_chell_script(stage_name, script):
        subprocess.check_output('{} &>logs/{}'.format(script, stage_name), shell=True)
And it works correctly when I launch my python script on mac. But if I launch it in a docker container (FROM ubuntu:18.04) I can't see any log files. I can fix it if I use bash -c 'command &>log_file' instead of just command &>log_file inside subprocess.check_output(...). But it looks like too much magic.
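A likely explanation (a sketch, not from the original post): with shell=True, subprocess runs the command string through /bin/sh. On Ubuntu 18.04, /bin/sh is dash, which does not support bash's &>file redirection; on mac, /bin/sh is bash, so the same string works there. Rather than embedding bash -c in the command, subprocess accepts an executable argument to swap in the shell that is used:

```python
import subprocess

# dash parses 'cmd &> f' as 'cmd & ; > f': the command runs in the
# background and f is merely truncated, so the log file stays empty.
# Telling subprocess to use bash as the shell restores bash semantics.
subprocess.run("echo hello &> /tmp/demo.log", shell=True, executable="/bin/bash")

with open("/tmp/demo.log") as f:
    print(f.read())  # hello
```

This assumes /bin/bash exists in the container, which it does in the ubuntu:18.04 base image.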
I thought about the default shell for the user which launches the python script (it's root), but cat /etc/passwd shows root ... /bin/bash.
It would be nice if someone explained to me what happened. And maybe I can add some lines to the dockerfile so I can use the same python script inside and outside the docker container?
As the OP reported in a comment that this fixed their problem, I'm posting it as an answer so they can accept it.
Using check_output when you don't expect any output is weird; and requiring shell=True here is misdirected. You want
    with open(os.path.join('logs', stage_name), 'w') as output:
        subprocess.run([script], stdout=output, stderr=output)
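Putting the answer together, the OP's wrapper could be rewritten along these lines (a sketch: it assumes script is the path to an executable file and that the logs directory exists, as in the question; check=True is added so a failing script raises, mirroring check_output's behavior):

```python
import os
import subprocess

def execute_chell_script(stage_name, script):
    # Open the log file ourselves and hand it to the child process.
    # No shell is involved, so the behavior is identical on mac and in docker.
    with open(os.path.join('logs', stage_name), 'w') as output:
        subprocess.run([script], stdout=output, stderr=output, check=True)
```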