

Serving Python (Flask) REST API over HTTP2

I have a Python REST service and I want to serve it using HTTP2. My current server setup is nginx -> Gunicorn. In other words, nginx (ports 443 and 80, with 80 redirecting to 443) runs as a reverse proxy and forwards requests to Gunicorn (port 8000, no SSL). nginx is running in HTTP2 mode, and I can verify that by using Chrome and inspecting the 'protocol' column after sending a simple GET to the server. However, Gunicorn reports that the requests it receives are HTTP1.0. Also, I couldn't find it in this list: https://github.com/http2/http2-spec/wiki/Implementations

So, my questions are:

  • Is it possible to serve a Python (Flask) application with HTTP2? If yes, which servers support it?
  • In my case (one reverse proxy server and one serving the actual API), which server has to support HTTP2?

The reason I want to use HTTP2 is that in some cases I need to perform thousands of requests all together, and I was interested to see if the multiplexed-requests feature of HTTP2 could speed things up. With HTTP1.0 and Python Requests as the client, each request takes ~80ms, which is unacceptable. The other solution would be to bulk/batch my REST resources and send multiple of them in a single request. Yes, that idea sounds just fine, but I am really interested to see if HTTP2 could speed things up.
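As a sketch of the bulk/batch alternative mentioned above (the endpoint name and payload shape are made up for illustration), a Flask route that accepts many resources in one request could look like this, exercised locally with Flask's test client so no network round-trip is needed:

```python
from flask import Flask, jsonify, request

app = Flask(__name__)

# Hypothetical bulk endpoint: accepts a JSON list of items in a single
# request instead of one request (and one ~80 ms round-trip) per item.
@app.route("/items/bulk", methods=["POST"])
def bulk_items():
    items = request.get_json()  # e.g. [{"id": 1}, {"id": 2}, ...]
    results = [{"id": item["id"], "status": "ok"} for item in items]
    return jsonify(results)

if __name__ == "__main__":
    # Exercise the endpoint in-process via the test client.
    with app.test_client() as client:
        resp = client.post("/items/bulk", json=[{"id": 1}, {"id": 2}])
        print(resp.get_json())
        # → [{'id': 1, 'status': 'ok'}, {'id': 2, 'status': 'ok'}]
```

This trades one round-trip per resource for one round-trip per batch, independent of the HTTP version in use.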

Finally, I should mention that for the client side I use Python Requests with the Hyper HTTP2 adapter.

Is it possible to serve a Python (Flask) application with HTTP/2?

Yes, by the information you provide, you are doing it just fine.

In my case (one reverse proxy server and one serving the actual API), which server has to support HTTP2?

Now I'm going to tread on thin ice and give opinions.

The way HTTP/2 has been deployed so far is by having an edge server that talks HTTP/2 (like ShimmerCat or NginX). That server terminates TLS and HTTP/2, and from there on uses HTTP/1, HTTP/1.1 or FastCGI to talk to the inner application.
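That edge-termination pattern can be sketched with a minimal nginx server block (the hostname, certificate paths, and upstream port are placeholders):

```nginx
server {
    listen 443 ssl http2;          # HTTP/2 and TLS are terminated here, at the edge
    server_name api.example.com;   # placeholder hostname

    ssl_certificate     /etc/ssl/example.crt;  # placeholder paths
    ssl_certificate_key /etc/ssl/example.key;

    location / {
        # The inner application (e.g. Gunicorn) speaks plain HTTP/1.x;
        # nginx translates between the two protocols.
        proxy_pass http://127.0.0.1:8000;
        proxy_set_header Host $host;
    }
}
```

This is why, in the setup described in the question, Gunicorn correctly reports HTTP/1.x requests even though the browser negotiates HTTP/2 with nginx.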

Can, at least theoretically, an edge server talk HTTP/2 to the web application? Yes, but HTTP/2 is complex, and for inner applications it doesn't pay off very well.

That's because most web application frameworks are built for handling requests for content, and that's done well enough with HTTP/1 or FastCGI. Although there are exceptions, web applications have little use for the subtleties of HTTP/2: multiplexing, prioritization, all the myriad of security precautions, and so on.

The resulting separation of concerns is in my opinion a good thing.


Your 80 ms response time may have little to do with the HTTP protocol you are using, but if those 80 ms are mostly spent waiting for input/output, then of course running things in parallel is a good thing.
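If those 80 ms really are I/O wait, the benefit of parallelism can be illustrated with a stdlib-only sketch (the `time.sleep` stands in for a network round-trip; the numbers are illustrative):

```python
import time
from concurrent.futures import ThreadPoolExecutor

def fake_request(i):
    """Stand-in for an I/O-bound REST call (~50 ms of waiting)."""
    time.sleep(0.05)
    return i

# Sequential: 20 calls back to back, roughly 20 * 50 ms in total.
start = time.perf_counter()
results = [fake_request(i) for i in range(20)]
sequential = time.perf_counter() - start

# Parallel: the same 20 calls overlap their waiting time.
start = time.perf_counter()
with ThreadPoolExecutor(max_workers=20) as pool:
    results = list(pool.map(fake_request, range(20)))
parallel = time.perf_counter() - start

print(f"sequential: {sequential:.2f}s, parallel: {parallel:.2f}s")
```

Because the work is pure waiting, the threaded version finishes in roughly the time of a single call; the same idea applies whether the parallelism comes from threads, greenlets, or HTTP/2 multiplexing.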

Gunicorn will use a thread or a process to handle each request (unless you have gone the extra mile to configure the greenlets backend), so consider whether letting Gunicorn spawn thousands of tasks is viable in your case.
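For reference, switching Gunicorn to greenlet-based workers is a configuration change; a sketch of a config file might look like this (the values are illustrative, and the `gevent` worker class requires `pip install gunicorn[gevent]`):

```python
# gunicorn_conf.py -- illustrative values, tune for your workload.
bind = "127.0.0.1:8000"
workers = 4                  # OS processes
worker_class = "gevent"      # greenlet-based workers for I/O-bound handlers
worker_connections = 1000    # max simultaneous clients per worker
```

It would then be started with something like `gunicorn -c gunicorn_conf.py app:app`, allowing many concurrent in-flight requests without a thread or process per request.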

If the content of your requests allows it, maybe you can create temporary files and serve them with an HTTP/2 edge server.

It is now possible to serve HTTP/2 directly from a Python app, for example using Twisted. You asked specifically about a Flask app though, in which case I'd (with bias) recommend Quart, which is the Flask API reimplemented on top of asyncio (with HTTP/2 support).

Your actual issue,

With HTTP1.0 and Python Requests as the client, each request takes ~80ms

suggests to me that the problem you may be experiencing is that each request opens a new connection. This could be alleviated via the use of a connection pool, without requiring HTTP/2.
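With Requests, that means reusing a single `Session` (which pools keep-alive connections) instead of calling `requests.get` per request; a sketch, with an illustrative pool size and a placeholder URL:

```python
import requests
from requests.adapters import HTTPAdapter

session = requests.Session()
# Size the pool for the concurrency you expect (values are illustrative).
adapter = HTTPAdapter(pool_connections=10, pool_maxsize=100)
session.mount("https://", adapter)

# Each call through the session reuses a pooled keep-alive connection,
# so the TCP/TLS handshake cost is paid once, not per request:
# for i in range(1000):
#     session.get("https://api.example.com/resource/%d" % i)  # placeholder URL
```

If most of the ~80 ms was connection setup, this alone can recover much of the latency that HTTP/2 multiplexing was expected to save.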
