
How to store subprocess call output to a file in Python

I have a bash script that I am running from Python using `call` from `subprocess`. I want the script's output (strings) to be stored in a file. The code I tried is:

import logging
from subprocess import call

logging.basicConfig(level=logging.DEBUG,
                    format='',
                    datefmt='',
                    filename=r'E:\FYP\FYPPP\AMAPT\log.txt',
                    filemode='w')
console = logging.StreamHandler()
console.setLevel(logging.INFO)
formatter = logging.Formatter('')
console.setFormatter(formatter)
logging.getLogger('').addHandler(console)
logging.info(call("sh amapt.sh", shell=True))
logger1 = logging.getLogger('myapp.area1')
logger2 = logging.getLogger('myapp.area2')
logger1.debug('test.')
logger1.info('test1.')
logger2.warning('test2.')
logger2.error('test3.')

But this logs the output of my script as an integer. I want the yellow/green output text (shown in the linked image) to be stored in the file, but it stores 255 instead of the text produced by my script.

Guide me please.

`call` returns the return code of your script, not its output. You need to capture the output of the process instead:

logging.info(check_output(["sh", "amapt.sh"]))

Notes:

  • if the process exits with a non-zero status, `check_output` raises a `CalledProcessError`
  • you don't need `shell=True` when you pass the command as a list of arguments
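As a minimal sketch of the `check_output` approach, with `echo hello` standing in for the real `sh amapt.sh` script:

```python
import logging
from subprocess import check_output, CalledProcessError

logging.basicConfig(level=logging.INFO, format='%(message)s')

try:
    # "echo hello" stands in for "sh amapt.sh" here;
    # check_output returns the command's stdout as bytes
    output = check_output(["echo", "hello"])
    logging.info(output.decode())
except CalledProcessError as exc:
    # raised when the command exits with a non-zero status
    logging.error("command failed with code %d", exc.returncode)
```

Because `check_output` returns `bytes`, decoding before logging keeps the log file free of `b'...'` wrappers.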

To get the output no matter what the exit status is, you could use `Popen` instead, for instance like this:

logging.info(Popen(["sh", "amapt.sh"], stdout=PIPE, stderr=STDOUT).stdout.read())

The added `stderr=STDOUT` redirects standard error into standard output, so error messages from the script end up in the log as well.
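Spelled out as a runnable sketch (again with `echo` as a stand-in command), the `Popen` version looks like this; calling `wait()` after reading avoids leaving a zombie process behind:

```python
import logging
from subprocess import Popen, PIPE, STDOUT

logging.basicConfig(level=logging.INFO, format='%(message)s')

# stderr=STDOUT merges standard error into standard output,
# so error text from the command is captured too
proc = Popen(["echo", "hello"], stdout=PIPE, stderr=STDOUT)
captured = proc.stdout.read()
proc.wait()  # reap the child process after reading its output
logging.info(captured.decode())
```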

Or, from Python 3.5 onward, you can use `subprocess.run()` instead of `Popen`:

logging.info(run(["sh", "amapt.sh"], stdout=PIPE, stderr=STDOUT).stdout)
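Note that `run()` returns a `CompletedProcess` whose `stdout` attribute is already `bytes`, so no `.read()` call is needed. A self-contained sketch, with `echo` standing in for the script:

```python
import logging
from subprocess import run, PIPE, STDOUT

logging.basicConfig(level=logging.INFO, format='%(message)s')

# run() waits for the command to finish and returns a CompletedProcess;
# .stdout holds the captured output as bytes (stderr is merged in)
result = run(["echo", "hello"], stdout=PIPE, stderr=STDOUT)
logging.info(result.stdout.decode())
```

On Python 3.7+ the `stdout=PIPE, stderr=STDOUT` pair could be shortened with `capture_output=True`, but that keeps the two streams separate rather than merged.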
