
How to limit the number of concurrent processes using the subprocess module in asyncio Python

import asyncio
import asyncio.subprocess

args = "blah blah argument "

async def handle_request():
    # (part of the request handler; wrapped in a coroutine here so the awaits are valid)
    create = asyncio.create_subprocess_shell(args, stdout=asyncio.subprocess.PIPE)
    proc = await create
    output = await proc.stdout.read()
    return output

This is part of my server code, which gets thousands of parallel hits from clients. How should I limit the maximum number of subprocesses the server creates to run the blah blah command? This code is currently using 100% of my CPU, and I need to deploy other servers on the same CPU.

asyncio.Semaphore limits the number of simultaneous jobs via an internal counter:

import asyncio
from asyncio.subprocess import PIPE

sem = asyncio.Semaphore(10)

async def do_job(args):
    async with sem:  # Don't run more than 10 simultaneous jobs below
        proc = await asyncio.create_subprocess_shell(args, stdout=PIPE)
        output = await proc.stdout.read()
        return output
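
With the semaphore in place you can still submit all jobs at once and let the semaphore gate how many subprocesses actually run. A minimal usage sketch (assuming Python 3.7+ for asyncio.run; the echo commands are just placeholders):

async def main():
    # 100 jobs are scheduled, but at most 10 subprocesses exist at any moment
    commands = ['echo job %d' % i for i in range(100)]
    outputs = await asyncio.gather(*(do_job(cmd) for cmd in commands))
    print(len(outputs), 'jobs finished')

asyncio.run(main())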

Note: you should make sure the number of jobs doesn't grow much faster than you can actually finish them. Otherwise, you'll need something more complex than this.
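
One possible shape for that more complex setup, sketched here only for illustration (the worker count of 10 and queue size of 50 are arbitrary): a bounded asyncio.Queue feeding a fixed pool of worker tasks, so producers wait when the backlog is full instead of piling up behind the semaphore.

import asyncio
from asyncio.subprocess import PIPE

async def worker(queue):
    while True:
        args = await queue.get()
        proc = await asyncio.create_subprocess_shell(args, stdout=PIPE)
        await proc.stdout.read()          # output is discarded in this sketch
        await proc.wait()
        queue.task_done()

async def run_all(commands):
    queue = asyncio.Queue(maxsize=50)     # put() blocks when 50 jobs are pending
    workers = [asyncio.create_task(worker(queue)) for _ in range(10)]
    for cmd in commands:
        await queue.put(cmd)              # back-pressure is applied to the producer here
    await queue.join()                    # wait until every queued job is done
    for w in workers:
        w.cancel()

asyncio.run(run_all(['echo job %d' % i for i in range(100)]))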
