
Asynchronous reading of multiple files in Tornado

I am trying to build a tiny status monitor for a server that broadcasts the information to a series of clients over a WebSocket. To that end, I am reading the output of several commands using tornado.process.Subprocess as well as various files under the /proc/ directory. I would like to know how I can asynchronously read the output of the different commands, updating the dictionary of values that the WebSocket channels will broadcast to the clients.

I tried using gen.coroutine and yielding a list with all the DummyFuture objects that each Subprocess call returns, unfortunately to no avail. Here is a simplified version of my code:

def get_data(*args, **kwargs):
    response_dict = {}
    fd_uname = process.Subprocess("uname -r", shell=True, stdout=process.Subprocess.STREAM).stdout

    f_uname = fd_uname.read_until_close() # A DummyFuture is generated
    fd_uptime = process.Subprocess("uptime | tail -n 1 | awk '{print $3 $4 $5}'", shell=True, stdout=process.Subprocess.STREAM).stdout
    f_uptime = fd_uptime.read_until_close()

    # In the end, the results will be stored as entries of response_dict 

data_dict = {}
def process_data():
    result = get_data() # The goal is to run this function asynchronously
    data_dict = result

open_ws = set()
class WebSocketIndexHandler(websocket.WebSocketHandler):
    def open(self):
        open_ws.add(self)
        self.callback = PeriodicCallback(self.send_data, 1000)
        self.callback.start()
        start_callback()

    def send_data(self):
        self.write_message(data_dict)

    def on_close(self):
        self.callback.stop()
        open_ws.remove(self)

PeriodicCallback(process_data, 1000)

I thought of using the callback parameter of read_until_close as a solution, assigning another callback parameter to get_data that would be called when all the other futures resolve successfully, but I find that solution rather cumbersome.

Thanks in advance!

To call a coroutine from another coroutine you need either "async def" and "await" in Python 3.5+, or else "gen.coroutine" and "yield". Here's the modern syntax:

from tornado import process

async def get_data(*args, **kwargs):
    response_dict = {}
    fd_uname = process.Subprocess("uname -r", shell=True, stdout=process.Subprocess.STREAM).stdout

    uname_result = await fd_uname.read_until_close()
    fd_uptime = process.Subprocess("uptime | tail -n 1 | awk '{print $3 $4 $5}'", shell=True, stdout=process.Subprocess.STREAM).stdout
    uptime_result = await fd_uptime.read_until_close()

    # Store the results as entries of response_dict
    response_dict["uname"] = uname_result.decode().strip()
    response_dict["uptime"] = uptime_result.decode().strip()
    return response_dict

async def process_data():
    result = await get_data() # The goal is to run this function asynchronously
    # Now do something with the result....

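If you are stuck on Python 2 or an older Tornado, the same coroutine can be written in the "gen.coroutine" and "yield" style mentioned above. A minimal sketch (the uname key and the decode/strip handling are my choices, not part of the original code):

```python
from tornado import gen, process

@gen.coroutine
def get_data():
    response_dict = {}
    fd_uname = process.Subprocess(
        "uname -r", shell=True,
        stdout=process.Subprocess.STREAM).stdout
    # yield suspends the coroutine until the stream is fully read
    uname_result = yield fd_uname.read_until_close()
    response_dict["uname"] = uname_result.decode().strip()
    # raise gen.Return(...) is the Python 2-compatible way to return
    # a value from a generator-based coroutine
    raise gen.Return(response_dict)
```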
Make sure Subprocess and STREAM come from tornado.process, of course, not from the standard library's subprocess module: passing the standard library's subprocess.PIPE does not give you a stream that can be read asynchronously.
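Since the original goal was to read the output of several commands concurrently, the reads can also be started together and awaited as a group with tornado.gen.multi. A sketch, assuming Tornado 5+ on a Unix host (the function name and dictionary keys are illustrative):

```python
from tornado import gen, process

async def get_data_concurrently():
    # Start both subprocesses before awaiting anything, so the two
    # commands run in parallel rather than one after the other.
    uname_stdout = process.Subprocess(
        "uname -r", shell=True,
        stdout=process.Subprocess.STREAM).stdout
    uptime_stdout = process.Subprocess(
        "uptime | tail -n 1 | awk '{print $3 $4 $5}'",
        shell=True, stdout=process.Subprocess.STREAM).stdout

    # gen.multi waits for every future in the list and returns the
    # results in the same order.
    uname_out, uptime_out = await gen.multi([
        uname_stdout.read_until_close(),
        uptime_stdout.read_until_close(),
    ])
    return {
        "uname": uname_out.decode().strip(),
        "uptime": uptime_out.decode().strip(),
    }
```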

For more information, see my article Refactoring Tornado Coroutines or the Tornado coroutine guide.
