
How can I selectively save stdout in a file only for one process?

To capture the output of an SSH command execution session in a text file, I did this:

sys.stdout = open('Output_Log.txt','w')

Now, once the SSH session is closed, I want to stop writing to the file and let the other print statements come to the console.

But that is not happening. The subsequent prints are overwriting my log file and I am losing my SSH command execution data.

Here is the complete code:

  session.exec_command('cd /project/neptune/neptune_psv/fw; ./Do_Regr.sh -i Testlist_Regression.in -m 135.24.237.198 -g')

  stdout_data = []
  stderr_data = []
  sys.stdout = open('Output_Log.txt', 'w')
  sys.stderr = open('Error_Log.txt', 'w')

  while True:
      try:
          if session.recv_ready():
              stdout_data.append(session.recv(16384))

          if session.recv_stderr_ready():
              stderr_data.append(session.recv_stderr(16384))

          if session.exit_status_ready():
              break

      except socket.timeout:
          print("SSH channel timeout exceeded.")
          break
      except Exception:
          traceback.print_exc()
          break

  print 'exit status: ', session.recv_exit_status()
  print ''.join(stdout_data)
  print ''.join(stderr_data)

  session.close()
  trans.close()
  print "############Regression Complete############"

When I open Output_Log.txt, I only find the last print. However, if I comment out the last print statement (Regression Complete), then Output_Log.txt nicely captures the stdout_data of the session.

Instead of redirecting sys.stdout and sys.stderr, write to the files directly. You are already capturing the output in lists; use that to your advantage:

try:
    # capture output streams
finally:
    with open('Output_Log.txt', 'w') as output:
        output.write(''.join(stdout_data))
        output.write('\nexit status: %s' % session.recv_exit_status())
    with open('Error_Log.txt', 'w') as output:
        output.write(''.join(stderr_data))

To save the output of an ssh command in a file, don't redirect Python's stdout/stderr; use the subprocess module instead:

from subprocess import call

cmd = ['ssh', 'host', 'cd /project/neptune/neptune_psv/fw; '
       './Do_Regr.sh -i Testlist_Regression.in -m 135.24.237.198 -g']
with open('Output_Log.txt', 'w') as out, open('Error_Log.txt','w') as err:
    rc = call(cmd, stdout=out, stderr=err)

Or, if you want to continue using paramiko, a minimal change to your code is to use out.write instead of stdout_data.append and err.write instead of stderr_data.append, where out and err are the corresponding file objects. Do not redirect sys.stdout and sys.stderr; it is unnecessary here.
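A minimal sketch of that change, assuming `session` is a paramiko channel as in the question (the helper name `capture_channel` is mine, not part of paramiko):

```python
# Sketch of the suggested change: write channel output straight to file
# objects instead of appending to lists or redirecting sys.stdout.
# `session` is assumed to be a paramiko Channel as in the question.
def capture_channel(session, out, err, bufsize=16384):
    """Stream the channel's stdout/stderr into the out/err file objects."""
    while True:
        if session.recv_ready():
            out.write(session.recv(bufsize))
        if session.recv_stderr_ready():
            err.write(session.recv_stderr(bufsize))
        if session.exit_status_ready():
            return session.recv_exit_status()
```

On Python 3, recv returns bytes, so the log files should be opened in binary mode ('wb').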

To avoid losing data, you should read possibly buffered data even after the subprocess has finished, i.e., read until EOF (until an empty result) after session.exit_status_ready() returns true.
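That drain step could be sketched as follows (the function name `drain` is illustrative; it works with any recv-style callable that returns an empty result at EOF):

```python
# Illustrative drain loop: after exit_status_ready(), the channel may still
# hold buffered output, so keep reading until recv returns an empty result.
def drain(recv, out, bufsize=16384):
    """Call recv(bufsize) until it returns an empty result, writing to out."""
    while True:
        chunk = recv(bufsize)
        if not chunk:
            break
        out.write(chunk)
```

After the main loop exits, something like `drain(session.recv, out)` and `drain(session.recv_stderr, err)` would collect the leftover output.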

Well, with the existing code I found one solution:

  stdout_data = []
  stderr_data = []
  temp = sys.stdout                        # Added this
  sys.stdout = open('Output_Log.txt', 'w')
  sys.stderr = open('Error_Log.txt', 'w')

  # --------------- Main Code ---------------------

  print 'exit status: ', session.recv_exit_status()
  print ''.join(stdout_data)
  print ''.join(stderr_data)

  session.close()
  trans.close()
  sys.stdout.close()
  sys.stdout = temp                        # Added this

  print "############Regression Complete############"

This works well for me, but I will try both of the solutions suggested above and reply later.
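For what it's worth, on Python 3.4+ the same save/restore idea can be written with contextlib.redirect_stdout, which puts the original sys.stdout back automatically even if an exception is raised. The code in this question is Python 2, so this is only an option after porting:

```python
import contextlib

# redirect_stdout swaps sys.stdout for the given file object inside the
# `with` block and restores the original on exit, so no temp variable
# or manual restore is needed.
with open('Output_Log.txt', 'w') as log:
    with contextlib.redirect_stdout(log):
        print('this line goes to Output_Log.txt')

print('back on the console')
```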
