
Unable to read stdout from a running process

I have read and tried 8 different methods answered on several questions about this. I opened a process in python and want to read its output even if the process hasn't terminated yet. The process doesn't complete usually for at least 1 minute or until an interrupt is sent. No matter what I try, I can't get it to read the output. I know the command and args I passed work because when I change it to subprocess.call(cmd, args), it prints everything to the screen. I also checked that the process is running with ps -ax. Here is an example of what I'm trying (cat /dev/random is UNRELATED to my project):

proc = subprocess.Popen(["cat", "/dev/random"], stdout=subprocess.PIPE, stderr=subprocess.PIPE)
print("Process started.")

Here is what I have tried that has failed so far:

for line in iter(p.stdout.readline, ''):
    strLine = str(line).rstrip()
    print(">>> " + strLine )
    sys.stdout.flush()

And

output, error = proc.communicate()
print output

And

while proc.poll() is None:
    print("Still waiting.")
    print(proc.stdout.readline(1))

There are more solutions I tried that are variations of this but no luck. When the call function is used without changing stdout, everything prints correctly to the console. What am I doing wrong?

I'm using Python 2.6.

I copied your code into a complete function-and-file, adding one change (repr) to avoid printing stuff that changes terminal titles and such, giving:

import subprocess
import sys

def tst():
    proc = subprocess.Popen(["cat", "/dev/random"],
        stdout=subprocess.PIPE, stderr=subprocess.PIPE)
    print("Process started.")
    for line in iter(p.stdout.readline, ''):
        strLine = str(line).rstrip()
        print(">>> " + repr(strLine))
        sys.stdout.flush

tst()

(oops, looks like my cut-and-paste dropped the parentheses on sys.stdout.flush! harmless in this case though)

Running this immediately produces the obvious error:

Process started.
Traceback (most recent call last):
  File "foo.py", line 13, in <module>
    tst()
  File "foo.py", line 8, in tst
    for line in iter(p.stdout.readline, ''):
NameError: global name 'p' is not defined

Fixing that (replacing p with proc), the example works, for some definition of "works": /dev/random does not stop producing output so it runs forever.
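
For reference, here is the corrected loop in one piece; I've added a line cap (my addition, not part of your code) so the sketch terminates instead of reading /dev/random forever:

import subprocess
import sys

proc = subprocess.Popen(["cat", "/dev/random"],
    stdout=subprocess.PIPE, stderr=subprocess.PIPE)
print("Process started.")
for count, line in enumerate(iter(proc.stdout.readline, b"")):
    print(">>> " + repr(line.rstrip()))
    sys.stdout.flush()
    if count >= 4:  # stop after a few lines so the demo ends
        break
proc.kill()   # Popen.kill() exists in Python 2.6+
proc.wait()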

The middle example will be a problem since proc.communicate() is going to read the entire output of the process, which is infinite and hence will run you out of memory (eventually). :-)
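
If you only need a sample of an endless stream, a bounded read avoids that; a minimal sketch (the 4096-byte size is arbitrary):

import subprocess

proc = subprocess.Popen(["cat", "/dev/random"],
    stdout=subprocess.PIPE, stderr=subprocess.PIPE)
chunk = proc.stdout.read(4096)  # returns once 4096 bytes have been produced
print(repr(chunk[:32]))         # peek at the first few bytes
proc.kill()                     # stop the endless producer
proc.wait()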

The third example works fine.

If you replace the cat /dev/random with something else, you may discover a more interesting and perhaps annoying aspect of Unix/Linux pipelines: a process's stdout stream is normally line buffered if and only if it goes to an "interactive device" (like a terminal window). A pipe is not an "interactive device", so stdout is block-buffered, unless the command in question overrides this itself. This may be the root of the problem I can't reproduce here.
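
To see this effect in isolation, here is a small sketch using a Python child purely as a stand-in for your command: the child prints one line per second, but because its stdout is a pipe rather than a terminal, nothing reaches the parent until the child exits and its buffer is flushed:

import subprocess
import sys

child_code = (
    "import time\n"
    "for i in range(3):\n"
    "    print('tick %d' % i)\n"
    "    time.sleep(1)\n"
)
proc = subprocess.Popen([sys.executable, "-c", child_code],
    stdout=subprocess.PIPE)
for line in iter(proc.stdout.readline, b""):
    print(">>> " + repr(line.rstrip()))  # all three lines arrive together at the end
proc.wait()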

You can work around this by using pseudo-ttys instead of (or in addition to) Python's subprocess module.
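
A minimal sketch of that approach, assuming a Unix-like system and using ping purely as a placeholder for a long-running command:

import os
import pty
import subprocess

# Give the child a pseudo-tty for stdout so it line-buffers its output as if
# it were writing to a terminal.
master_fd, slave_fd = pty.openpty()
proc = subprocess.Popen(["ping", "-c", "5", "127.0.0.1"],
    stdout=slave_fd, stderr=slave_fd, close_fds=True)
os.close(slave_fd)  # the parent only needs the master end

try:
    while True:
        data = os.read(master_fd, 1024)  # returns as soon as the child writes
        if not data:
            break
        print(">>> " + repr(data))
except OSError:
    pass  # on Linux, reading the master after the child exits raises EIO
finally:
    os.close(master_fd)
    proc.wait()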
