
Tornado: How to share pymongo connection for multiple requests?

I want to share a MongoDB connection for multiple requests. This is what I have now, but it looks like a new connection is being created for each request.

import asyncmongo
import tornado.web

# A single client (and its connection pool) intended to be shared by every request handler.
dbasync = asyncmongo.Client(pool_id='mydb', host='127.0.0.1', port=27017,
                            maxcached=10, maxconnections=50, dbname='bench')

@route('/readAsync')
class ReadAllAsynchHandler(tornado.web.RequestHandler):
    @tornado.web.asynchronous
    def get(self):
        print("getting sessions")
        dbasync["ss"].find({}, callback=self._on_response)

    def _on_response(self, response, error):
        print("on response: %s" % response)
        if error:
            raise tornado.web.HTTPError(500)
        self.finish(SS_TEMPLATE.generate(sessions=response))

When benchmarking with 1000 concurrent clients, I get these errors:

Traceback (most recent call last):
  File "/home/ubuntu/envs/myproj/local/lib/python2.7/site-packages/tornado/web.py", line 1115, in _stack_context_handle_exception
    raise_exc_info((type, value, traceback))
  File "/home/ubuntu/envs/myproj/local/lib/python2.7/site-packages/tornado/web.py", line 1298, in wrapper
    result = method(self, *args, **kwargs)
  File "bench.py", line 29, in get
    dbasync["ss"].find({}, callback=self._on_response)
  File "/home/ubuntu/envs/myproj/local/lib/python2.7/site-packages/asyncmongo/cursor.py", line 380, in find
    connection = self.__pool.connection()
  File "/home/ubuntu/envs/myproj/local/lib/python2.7/site-packages/asyncmongo/pool.py", line 116, in connection
    raise TooManyConnections("%d connections are already equal to the max: %d" % (self._connections, self._maxconnections))
TooManyConnections: 50 connections are already equal to the max: 50

DEBUG:root:dropping connection. connection pool (10) is full. maxcached 10

The maxconnections parameter does not serve to buffer up requests to be reused by the existing connection pool. Rather, it just exists to make sure that, if desired, your application will not consume an unbounded amount of resources. For some more discussion of this behavior, see https://github.com/bitly/asyncmongo/pull/45 . That pull request appears to provide the behavior you want, and you could install asyncmongo with the author's revisions using something like:

pip install git+git://github.com/ceymard/asyncmongo.git@7a8e6f6f446d71f8fd4f17de48994c0b6bee72ee
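
If you would rather not install a fork, one application-level workaround is to catch the exception yourself and retry the query a moment later. This is only a sketch of the general idea, not what the pull request implements; the find_with_retry helper is hypothetical, and it assumes TooManyConnections can be imported from asyncmongo.errors (it is the exception shown in your traceback):

import functools
import time

import tornado.ioloop
from asyncmongo.errors import TooManyConnections  # assumption: exception exported from asyncmongo.errors

def find_with_retry(collection, spec, callback, delay=0.05):
    # Hypothetical helper: if every socket in the pool is busy, retry the
    # query a little later on the IOLoop instead of failing the request.
    try:
        collection.find(spec, callback=callback)
    except TooManyConnections:
        tornado.ioloop.IOLoop.instance().add_timeout(
            time.time() + delay,
            functools.partial(find_with_retry, collection, spec, callback, delay))

You would then call find_with_retry(dbasync["ss"], {}, callback=self._on_response) from the handler's get().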

Alternatively, you may be able to limit the number of concurrent connections to your application somewhere in the Tornado settings, or else via, say, nginx (see HttpLimitConnModule); a rough sketch of doing the limiting inside the application itself follows.
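
As an illustration of that application-side limiting (a minimal sketch only: it reuses the dbasync client and SS_TEMPLATE from your question, and the handler name, counter, and limit are hypothetical), you could shed load before the pool fills up:

import tornado.web

MAX_IN_FLIGHT = 50          # keep this at or below the pool's maxconnections
_in_flight = [0]            # in-process counter shared by all requests

class ThrottledReadHandler(tornado.web.RequestHandler):
    @tornado.web.asynchronous
    def get(self):
        if _in_flight[0] >= MAX_IN_FLIGHT:
            # Fail fast with a 503 instead of exhausting the asyncmongo pool.
            raise tornado.web.HTTPError(503)
        _in_flight[0] += 1
        dbasync["ss"].find({}, callback=self._on_response)

    def _on_response(self, response, error):
        _in_flight[0] -= 1
        if error:
            raise tornado.web.HTTPError(500)
        self.finish(SS_TEMPLATE.generate(sessions=response))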
