
Open SSH connection in Python subprocess and avoid infinite loop in subprocess

I am executing a Python script (1) that opens an SSH connection in a subprocess and runs another Python script (2) on a remote machine. Currently I wait for script (2) to print something before closing the SSH connection in (1). That works well so far, but what I am worried about is what happens if script (2) gets trapped in an infinite loop without printing anything. Script (2) is constantly under construction and will be extended a lot in the future, so this might become a realistic scenario at some point. In that case script (1) also runs forever and I am left with some kind of zombie SSH connection. How can I avoid that? I experimented with threading.Timer to kill the SSH connection after 10 seconds no matter what happens, but couldn't get it to work properly; I have never worked with threading before. As a side note: I do not run script (1) manually and can't really kill it directly.

Does anyone have a smart idea about how to solve this?

try:
    sshProcess = subprocess.Popen(['ssh',
                                   ...,
                                   '/bin/bash'],
                                  stdin=subprocess.PIPE,
                                  stdout=subprocess.PIPE,
                                  universal_newlines=True,
                                  bufsize=0)

    sshProcess.stdin.write("nohup python3 ... &\n")  # trailing newline so bash executes the command

    for line in sshProcess.stdout:
        line = line.rstrip('\n')  # lines from stdout keep their newline, so strip before comparing
        if line == 'status: 1':
            print('status: 1')
            sshProcess.stdin.close()
            break
        elif line == 'status: 0':
            print('status: 0')
            sshProcess.stdin.close()
            break
        else:
            print(line)  # not a status line yet; keep reading instead of breaking
except OSError as exc:  # avoid a bare except that would hide unrelated errors
    print('error:', exc)
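For the timeout worry, `subprocess` has built-in support: `communicate(timeout=...)` raises `TimeoutExpired` if the child does not finish in time, and you can then kill it explicitly so no hung SSH connection is left behind. A minimal sketch of the idea, with a local `sleep` standing in for the `ssh ... /bin/bash` command (the helper name and the 2-second deadline are illustrative, not from the original):

```python
import subprocess

def run_with_deadline(cmd, payload, deadline):
    """Run cmd, feed it payload on stdin, and give up after `deadline` seconds.

    Returns the child's stdout on success, or None if it timed out.
    """
    proc = subprocess.Popen(cmd,
                            stdin=subprocess.PIPE,
                            stdout=subprocess.PIPE,
                            universal_newlines=True)
    try:
        out, _ = proc.communicate(payload, timeout=deadline)
        return out
    except subprocess.TimeoutExpired:
        proc.kill()          # terminate the hung child...
        proc.communicate()   # ...and reap it so it cannot linger as a zombie
        return None

# A command that would otherwise hang forever, capped at 2 seconds:
result = run_with_deadline(['sleep', '1000'], '', 2)
print('timed out' if result is None else result)
```

The same pattern applies to the `ssh` invocation above: replace `['sleep', '1000']` with the `ssh` command line and pass the `nohup python3 ... &\n` string as the payload.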

The script that is supposed to be started looks like this:

import multiprocessing
import time


def main():
    try:
        process = parse_args()

        proc = multiprocessing.Process(target=do_something, args=(process,))
        proc.start()
        print('status: 1')
    except Exception:
        print('status: 0')

def do_something(args):
    # creates some folders, copies some files, does some calculations...
    # calls a bunch of .sh scripts and other things in a subprocess
    time.sleep(10)


if __name__ == "__main__":
    main()

Both systems share a home directory over NFS.

Because both systems share a home directory over NFS, you don't need to stream output over the SSH connection at all.

logFileName, _ = subprocess.Popen(['ssh', 'user@host', 'bash -s'],
                                  stdin=subprocess.PIPE,
                                  stdout=subprocess.PIPE,
                                  universal_newlines=True).communicate('''
    logFile="myProcess.$$.log"
    printf '%s\n' "$logFile"
    python3 otherProcess ... </dev/null >"$logFile" 2>&1 & disown -h "$!"
''')
logFileName = logFileName.strip()  # drop the trailing newline printf emitted

On success, this puts the name of a file that can be monitored to retrieve your program's status into the Python variable logFileName ($$ expands to the PID of the remote shell, so the name is different for each instance).

Note that if you want content to be immediately available, you should probably add flush=True to your print() calls.
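Since the log file lives on the shared NFS home directory, script (1) can then poll it locally with its own deadline, which also solves the infinite-loop concern. A minimal sketch of that monitoring side (the function name and the `status:` line format are assumed from the question, not part of the original answer):

```python
import os
import time

def wait_for_status(log_path, timeout=10.0, poll=0.5):
    """Poll the NFS-shared log file until a 'status: ...' line appears
    or `timeout` seconds elapse.  Returns the status line, or None on timeout.
    """
    deadline = time.monotonic() + timeout
    while time.monotonic() < deadline:
        if os.path.exists(log_path):
            with open(log_path) as f:
                for line in f:
                    if line.startswith('status:'):
                        return line.strip()
        time.sleep(poll)  # wait a bit before re-reading the file
    return None
```

If `wait_for_status(logFileName)` returns None, the remote script is presumably stuck, and script (1) can log that and exit instead of hanging forever.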
