
Nginx: how to continuously cache response?

I have a REST API on the backend, and this API is cached by Nginx (the TTL is about 10 minutes).

But I have a problem with the number of calls to my backend API between the moment the Nginx cache expires and the moment the cache is populated again.

That number of calls in such a short period of time is too large and causes server overload.

How can I keep sending the cached response when the Nginx burst limit is exceeded? (Docs: "Excessive requests are delayed until their number exceeds the maximum burst size, in which case the request is terminated with an error.")
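For reference, a minimal sketch of the kind of rate-limit configuration that quote describes; the zone name, rate and burst values below are illustrative, not taken from the question:

    # Illustrative values: at most 10 requests/second per client IP,
    # with a burst queue of 20 extra requests.
    limit_req_zone $binary_remote_addr zone=api_limit:10m rate=10r/s;

    server {
        location /api/ {
            # Requests above the rate are delayed up to "burst";
            # anything beyond that is rejected (503 by default,
            # configurable with limit_req_status).
            limit_req zone=api_limit burst=20;
            proxy_pass http://backend;
        }
    }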


Can I serve a previously cached response with Nginx while my backend is processing a new one? Is it possible to attach a custom trigger to the burst-limit event?
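The behaviour being asked about here is close to what proxy_cache_use_stale updating (together with proxy_cache_background_update and proxy_cache_lock) provides; a minimal sketch, assuming a standard proxy_cache setup, with an illustrative zone name, cache path and upstream name:

    proxy_cache_path /var/cache/nginx keys_zone=api_cache:10m max_size=1g inactive=60m;

    server {
        location /api/ {
            proxy_cache api_cache;
            proxy_cache_valid 200 10m;

            # Serve the stale copy while the entry is being refreshed,
            # instead of letting every client hit the backend on expiry.
            proxy_cache_use_stale updating error timeout http_500 http_502 http_503 http_504;
            proxy_cache_background_update on;

            # Let only one request per key populate the cache;
            # the others wait for it or get the stale copy above.
            proxy_cache_lock on;

            proxy_pass http://backend;
        }
    }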


Any advice or example would be appreciated!

As Igor (the guy behind Nginx) mentioned here:

You may bypass cache using "Some-Secret-Header: 1" in a request and setting: proxy_no_cache $http_some_secret_header; The response may be cached.

(Source: https://forum.nginx.org/read.php?2,99559,99567#msg-99567)
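A rough sketch of that idea (using the same illustrative "Some-Secret-Header"; note that proxy_cache_bypass is the directive that skips the cached copy while still allowing the fresh response to be stored, whereas proxy_no_cache only prevents the response from being saved):

    location /api/ {
        proxy_cache api_cache;

        # A request carrying "Some-Secret-Header: 1" is not answered from
        # the cache; it goes to the backend, and the fresh response can
        # replace the cached entry.
        proxy_cache_bypass $http_some_secret_header;

        # Add this as well only if such responses should not be stored at all:
        # proxy_no_cache $http_some_secret_header;

        proxy_pass http://backend;
    }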

Another alternative would be to use the "Cache Purge" module: https://www.nginx.com/resources/wiki/modules/
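For completeness, a sketch along the lines of the third-party ngx_cache_purge module's own example (the module has to be compiled into Nginx, and the purge key must match your proxy_cache_key; names and paths here are illustrative):

    proxy_cache_path /tmp/cache keys_zone=tmpcache:10m;

    location / {
        proxy_pass http://127.0.0.1:8000;
        proxy_cache tmpcache;
        proxy_cache_key $uri$is_args$args;
    }

    location ~ /purge(/.*) {
        # Restrict purging to trusted clients.
        allow 127.0.0.1;
        deny all;
        proxy_cache_purge tmpcache $1$is_args$args;
    }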
