
Python Tornado server loses connections

I have a Python HTTP server built with the Tornado framework. After several requests it becomes unavailable, and the page cannot be opened in a browser either. After about 20 seconds of inactivity, it starts working again.

Out of 100,000 requests, about 10 raise exceptions. At this load the server process consumes about 30% CPU.

Why does the server become unavailable?

Server:

import random
import string
from multiprocessing import Process

import tornado.httpserver
import tornado.ioloop
import tornado.web

start_port = 4400
workers = 1

class MainHandler(tornado.web.RequestHandler):
    def get(self):
        data = ''.join(random.choice(string.ascii_uppercase + string.digits) for _ in range(1000))
        self.write(data)


def server_process(port):
    application = tornado.web.Application([
        (r"/", MainHandler),
    ])

    http_server = tornado.httpserver.HTTPServer(application)

    http_server.listen(port)
    tornado.ioloop.IOLoop.instance().start()


if __name__ == "__main__":
    for i in xrange(workers):
        port = start_port + i
        print 'process started on %d port' % port
        p = Process(target=server_process, args=(port,))
        p.start()

Client:

import time
import traceback
from multiprocessing import Process

import requests


def f():
    for i in xrange(500000):
        try:
            r = requests.get('http://127.0.0.1:4400')
            if i % 100 == 0:
                print i, str(r.text)
        except:
            print traceback.format_exc()
            time.sleep(5)


if __name__ == '__main__':
    for j in xrange(1):
        p = Process(target=f)
        p.start()
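
For comparison, below is a sketch of the same client using a single requests.Session, which keeps TCP connections alive through requests' connection pool instead of opening a new socket for every call. The URL and loop count are taken from the original client; whether this changes the observed behavior is not claimed here.

import time
import traceback
from multiprocessing import Process

import requests


def f():
    session = requests.Session()  # reuse one connection pool instead of a new socket per request
    for i in xrange(500000):
        try:
            r = session.get('http://127.0.0.1:4400')
            if i % 100 == 0:
                print i, str(r.text)
        except:
            print traceback.format_exc()
            time.sleep(5)


if __name__ == '__main__':
    for j in xrange(1):
        p = Process(target=f)
        p.start()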

Traceback:

Traceback (most recent call last):
  File "/home/me/PycharmProjects/test/client.py", line 16, in f
    if i % 100 == 0:
  File "/usr/local/lib/python2.7/dist-packages/requests/api.py", line 55, in get
    return request('get', url, **kwargs)
  File "/usr/local/lib/python2.7/dist-packages/requests/api.py", line 44, in request
    return session.request(method=method, url=url, **kwargs)
  File "/usr/local/lib/python2.7/dist-packages/requests/sessions.py", line 383, in request
    resp = self.send(prep, **send_kwargs)
  File "/usr/local/lib/python2.7/dist-packages/requests/sessions.py", line 486, in send
    r = adapter.send(request, **kwargs)
  File "/usr/local/lib/python2.7/dist-packages/requests/adapters.py", line 378, in send
    raise ConnectionError(e)
ConnectionError: HTTPConnectionPool(host='127.0.0.1', port=4400): Max retries exceeded with url: / (Caused by <class 'socket.error'>: [Errno 99] Cannot assign requested address)

Updated:

Experimentally, I found a limit of about 300 requests per second per hardware server (not per process). This value does not depend on the number of running Tornado processes. Adding nginx as a proxy server did not help.

The server was running on Ubuntu Server 12.04 and Linux Mint 16. It looks like this limitation is specific to Debian-based operating systems.

Problem solved. multiprocessing.Process is a bad idea for running several Tornado processes. I am using tornado.process instead.
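
For reference, here is a minimal sketch of the tornado.process approach, assuming all workers share a single pre-bound socket on port 4400 (fork_processes(0) forks one worker per CPU core):

import random
import string

import tornado.httpserver
import tornado.ioloop
import tornado.netutil
import tornado.process
import tornado.web


class MainHandler(tornado.web.RequestHandler):
    def get(self):
        # Same handler as above: respond with 1000 random alphanumeric characters
        data = ''.join(random.choice(string.ascii_uppercase + string.digits) for _ in range(1000))
        self.write(data)


if __name__ == "__main__":
    application = tornado.web.Application([
        (r"/", MainHandler),
    ])

    # Bind the listening socket once, before forking
    sockets = tornado.netutil.bind_sockets(4400)
    # Fork worker processes; 0 means one child per CPU core
    tornado.process.fork_processes(0)
    # Each child serves requests on the shared, pre-bound socket
    server = tornado.httpserver.HTTPServer(application)
    server.add_sockets(sockets)
    tornado.ioloop.IOLoop.instance().start()

With this pattern the socket is bound once in the parent, so every worker accepts connections on the same port and no per-process port assignment is needed.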
