Python JSON RPC server with ability to stream
I have come across several guides and packages on implementing a Python JSON RPC server, eg:
They all do a good job in the sense that the server/application implementation is very simple: you just return a Python object as the result and the framework takes care of serializing it. However, this is not suitable for my needs, mainly because I expect to serialize possibly thousands of records from a database, and such a solution would require me to build a single Python object containing all the records and return it as the result.
The ideal solution I am looking for would involve a framework that provides the application a stream to write the response to, plus a JSON encoder that can encode an iterator (in this case a cursor from pyodbc) on the fly, something like this:
def process(self, request, response):
    # Retrieve parameters from the request.
    cursor = self.conn.cursor()
    cursor.execute(sql)  # etc.
    # Dump the column descriptions and the results (an iterator).
    json.dump([cursor.description, cursor], response.getOut())
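The standard-library `json` module cannot serialize an arbitrary iterator directly, but the desired behavior can be approximated with a small generator that emits the JSON array piece by piece. This is only a sketch: `json_chunks` is a hypothetical helper, and it assumes the column descriptions and rows have already been reduced to JSON-serializable values (pyodbc's `cursor.description` contains type objects that would need to be stripped first).

```python
import json

def json_chunks(description, rows):
    """Yield text chunks of a JSON array [description, row, row, ...]
    without ever materializing all rows in memory."""
    yield '['
    yield json.dumps(description)
    for row in rows:
        yield ','
        yield json.dumps(list(row))
    yield ']'

# Each chunk can be written to the response stream as it is produced:
#     for chunk in json_chunks(columns, cursor):
#         response.getOut().write(chunk)
```

Because the rows are consumed lazily, memory use stays constant no matter how many records the cursor returns.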
Can someone point me to a server framework that can provide a stream to write to, and a JSON serialization framework that can handle an iterable such as a pyodbc cursor and serialize it on the fly?
If the typical JSON-RPC frameworks don't let you dump such a large amount of data efficiently, why not just use an HTTP server and return the JSON data directly? That way you can stream the response and read the streamed data on the client side. A nice bonus is that you can even gzip it for faster transfer, and you can use many standard servers, such as Apache.
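The plain-HTTP approach suggested above can be sketched with a bare WSGI application, since WSGI lets the app return any iterable of byte chunks and the server streams them out as they are produced. `fetch_rows` below is a hypothetical stand-in for iterating a pyodbc cursor:

```python
import json
from wsgiref.simple_server import make_server

def fetch_rows():
    # Hypothetical stand-in for iterating a pyodbc cursor row by row.
    for i in range(3):
        yield {"id": i}

def app(environ, start_response):
    start_response("200 OK", [("Content-Type", "application/json")])

    def body():
        # Emit the JSON array incrementally, one row at a time.
        yield b"["
        first = True
        for row in fetch_rows():
            if not first:
                yield b","
            first = False
            yield json.dumps(row).encode("utf-8")
        yield b"]"

    return body()

# To serve: make_server("", 8000, app).serve_forever()
```

Any WSGI-compliant server (including one behind Apache via mod_wsgi) will transmit the chunks as the generator yields them, so the full result set never has to sit in memory.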