
Paramiko exec_command hangs on docker exec

I am using Paramiko to test docker commands from an external system (I need to do this; I can't just build the container and test it locally). The test case I am trying to run involves starting up Apache Spark and running one of the examples, specifically SparkPi. For some reason my Python script hangs on the docker exec ... command below. However, I have previously run other docker exec commands this way without a problem, and everything works when I run it manually. It only breaks when I put everything in the script.

Command:

stdin, stdout, stderr = ssh_client.exec_command(f'docker exec {spark_container_id} bash -c \'"$SPARK_HOME"/bin/spark-submit --class org.apache.spark.examples.SparkPi \
                --master spark://$(hostname):7077 "$SPARK_HOME"/examples/jars/spark-examples_2.11-2.1.1.jar {self.slices_to_calculate}\'')
print("\nstdout is:\n" + stdout.read() + "\nstderr is:\n" + stderr.read()) 

Any idea what could be causing this? And why?
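
For context, here is a minimal sketch of the Paramiko setup around the call above. The hostname, credentials, and the way the container id is discovered are placeholders I'm assuming, not my actual environment:

import paramiko

# Placeholder connection details -- adjust to your environment.
ssh_client = paramiko.SSHClient()
ssh_client.set_missing_host_key_policy(paramiko.AutoAddPolicy())
ssh_client.connect('docker-host.example.com', username='user', password='secret')

# The Spark container id would normally be discovered first, e.g.:
stdin, stdout, stderr = ssh_client.exec_command('docker ps -q --filter name=spark')
spark_container_id = stdout.read().decode().strip()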

I found out that the reason for this is that I didn't pass the get_pty=True parameter to exec_command. It must be the case that attaching a terminal to the spark-submit command makes the output get printed properly. So the solution to this would be:

stdin, stdout, stderr = ssh_client.exec_command(f'docker exec -t {spark_container_id} bash -c \'"$SPARK_HOME"/bin/spark-submit ...', get_pty=True)

NOTE: By using get_pty=True, the stdout and stderr of the exec_command get combined.
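
Putting it together, here is a hedged sketch of the full call with get_pty=True. The timeout, the local slices_to_calculate variable (standing in for self.slices_to_calculate), and the exit-status check are my additions for illustration, not part of the original answer:

slices_to_calculate = 10  # placeholder for the number of SparkPi partitions

command = (
    f'docker exec -t {spark_container_id} bash -c '
    f'\'"$SPARK_HOME"/bin/spark-submit '
    f'--class org.apache.spark.examples.SparkPi '
    f'--master spark://$(hostname):7077 '
    f'"$SPARK_HOME"/examples/jars/spark-examples_2.11-2.1.1.jar {slices_to_calculate}\''
)

# get_pty=True attaches a pseudo-terminal, which merges stderr into stdout.
stdin, stdout, stderr = ssh_client.exec_command(command, get_pty=True, timeout=600)

output = stdout.read().decode()                   # combined stdout + stderr because of the PTY
exit_status = stdout.channel.recv_exit_status()   # blocks until the remote command finishes
print(output)
print(f"spark-submit exited with status {exit_status}")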
