
Python 3.4 asyncio task doesn't get fully executed

I'm experimenting with Python 3.4's asyncio module. Since there's no production-ready MongoDB package for asyncio, I have written a small wrapper class that executes all Mongo queries in an executor. This is the wrapper:

import asyncio
from functools import wraps
from pymongo import MongoClient


class AsyncCollection(object):
    def __init__(self, client):
        self._client = client
        self._loop = asyncio.get_event_loop()

    def _async_deco(self, name):
        method = getattr(self._client, name)

        @wraps(method)
        @asyncio.coroutine
        def wrapper(*args, **kwargs):
            print('starting', name, self._client)
            # run_in_executor() only forwards positional args, so wrap the
            # call in a lambda to pass keyword args through as well
            r = yield from self._loop.run_in_executor(
                None, lambda: method(*args, **kwargs))
            print('done', name, self._client, r)
            return r

        return wrapper

    def __getattr__(self, name):
        return self._async_deco(name)


class AsyncDatabase(object):
    def __init__(self, client):
        self._client = client
        self._collections = {}


    def __getitem__(self, col):
        return self._collections.setdefault(col, AsyncCollection(self._client[col]))


class AsyncMongoClient(object):
    def __init__(self, host, port):
        self._client = MongoClient(host, port)
        self._loop = asyncio.get_event_loop()
        self._databases = {}

    def __getitem__(self, db):
        return self._databases.setdefault(db, AsyncDatabase(self._client[db]))

I want to execute inserts asynchronously, meaning that the coroutine that executes them doesn't wait for the execution to complete. The asyncio manual states that "A task is automatically scheduled for execution when it is created. The event loop stops when all tasks are done." So I constructed this test script:

from asyncdb import AsyncMongoClient
import asyncio

@asyncio.coroutine
def main():
    print("Started")
    mongo = AsyncMongoClient("host", 27017)
    asyncio.async(mongo['test']['test'].insert({'_id' : 'test'}))
    print("Done")

loop = asyncio.get_event_loop()
loop.run_until_complete(main())

When I run the script I get the following result:

Started
Done
starting insert Collection(Database(MongoClient('host', 27017), 'test'), 'test')

There should be a line indicating that the Mongo query is done. I can see that line when I `yield from` this coroutine instead of running it using `asyncio.async`. However, what's really odd is that the test entry actually exists in MongoDB when I run this coroutine using `asyncio.async`, so despite the fact that it seems to work, I don't understand why I can't see the print statement indicating that the query has been performed. Since I run the event loop using `run_until_complete`, I'd expect it to wait for the insert task to complete, even if the main coroutine finishes first.
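One plausible reading of why the document appears in MongoDB even though "done" never prints: `run_in_executor()` hands the blocking call to a thread pool, and the worker thread finishes on its own even if the event loop stops before the wrapping coroutine is resumed. A minimal sketch with a plain `ThreadPoolExecutor` and no event loop at all (the function name is illustrative, standing in for the blocking pymongo insert):

```python
import time
from concurrent.futures import ThreadPoolExecutor

done = []

def blocking_insert():
    # stand-in for the blocking pymongo insert call
    time.sleep(0.05)
    done.append('inserted')

# run_in_executor() submits the callable to a pool much like this one;
# the worker thread keeps running even if no coroutine awaits the result
with ThreadPoolExecutor() as pool:
    pool.submit(blocking_insert)
# leaving the with-block waits for the worker thread to finish
print(done)  # -> ['inserted']
```

The insert thus completes in its thread regardless of whether the loop is still running to resume the coroutine and print "done".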

`asyncio.async(mongo...)` just schedules the Mongo query, and `run_until_complete()` doesn't wait for it. Here's a code example that shows this using the `asyncio.sleep()` coroutine:

#!/usr/bin/env python3
import asyncio
from contextlib import closing
from timeit import default_timer as timer

@asyncio.coroutine
def sleep_BROKEN(n):
    # schedule coroutine; it runs on the next yield
    asyncio.async(asyncio.sleep(n))

@asyncio.coroutine
def sleep(n):
    yield from asyncio.sleep(n)

@asyncio.coroutine
def double_sleep(n):
    f = asyncio.async(asyncio.sleep(n))
    yield from asyncio.sleep(n) # the first sleep is also started
    yield from f

n = 2
with closing(asyncio.get_event_loop()) as loop:
    start = timer()
    loop.run_until_complete(sleep_BROKEN(n))
    print(timer() - start)
    loop.run_until_complete(sleep(n))
    print(timer() - start)
    loop.run_until_complete(double_sleep(n))
    print(timer() - start)

Output:

0.0001221800921484828
2.002586881048046
4.005100341048092

The output shows that `run_until_complete(sleep_BROKEN(n))` returns in less than 2 milliseconds instead of 2 seconds, while `run_until_complete(sleep(n))` works as it should: it returns after 2 seconds. `double_sleep()` shows that coroutines scheduled by `asyncio.async()` are run on `yield from` (the two concurrent sleeps run in parallel), i.e., it sleeps 2 seconds, not 4. If you add a delay (without allowing the event loop to run) before the first `yield from`, you'll see that `yield from f` doesn't return any sooner, i.e., `asyncio.async()` doesn't run the coroutine; it only schedules it to run.
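To make the original script deterministic, the scheduled task has to be awaited before the loop stops. A minimal sketch of that pattern, written in the modern async/await spelling (`asyncio.async()` was renamed `asyncio.ensure_future()` in Python 3.4.4 and later removed), with `asyncio.sleep()` standing in for the executor-backed Mongo insert:

```python
import asyncio

log = []

async def insert(doc):
    # stand-in for the executor-backed mongo insert
    await asyncio.sleep(0.01)
    log.append(doc)

async def main():
    # schedule without blocking, but keep a handle on the task
    task = asyncio.ensure_future(insert('test'))
    log.append('main done')
    # awaiting the handle guarantees the insert finishes
    # before the event loop stops
    await task

asyncio.run(main())
print(log)  # -> ['main done', 'test']
```

The key design point is keeping the future returned by `ensure_future()`: the caller stays free to do other work after scheduling, but awaits the handle before returning so the loop doesn't stop with the task still pending.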
