
Nginx and Node.js server - multiple tasks

UPDATE

I have a few questions about the combination of Nginx and Node.js.

I've used Node.js to create my server, and now I'm facing an issue with locking the server during actions (writing, removing, etc.). We are using Redis to lock the server while a request is being handled; for example, if a new user is signing up, all other requests wait until that process is done, and if there is a longer-running process, all other requests wait even longer.

We thought about creating a load balancer (using Nginx) that would check whether the server is locked; if it is, the load balancer would open a new task instead of waiting for the first process to finish.

I used this tutorial and created a dummy server, but I've struggled with how to implement this functionality of opening new ports.

I'm new to load-balancing implementation, and I would be happy to hear your thoughts.

Thank you.

The gist of it is that your server must not crash when more than one connection attempt is made to it. Even if you use NGINX as a load balancer and have five different instances of your server running, what happens when six clients try to access your app at once?

I think you are thinking about load balancers slightly wrong. 我认为您在考虑负载均衡器有些错误。 There are different load balancing methods, but the simplest one to think about is "round robin" in which each connection gets forwarded to the next server in the list (the rest are just more robust and complicated versions of this one). 有不同的负载平衡方法,但是最简单考虑的是“循环”,其中每个连接都转发到列表中的下一个服务器(其余只是该连接的更强大和更复杂的版本)。 When there are no more servers to forward to, the next connection gets forwarded to the first server again (whether or not it is done with its last connection) and the circle starts over. 如果没有更多服务器要转发到,则下一个连接再次转发到第一个服务器(无论是否通过其最后一个连接完成),圈子重新开始。 Thus, load balancers aren't supposed to manage "unique connections" from clients...they are supposed to distribute connections among servers. 因此,负载平衡器不应管理来自客户端的“唯一连接”……它们应在服务器之间分配连接。
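For illustration, a round-robin setup can be expressed in a few lines of nginx configuration. This is a hedged sketch: the upstream name, the ports, and the assumption that three Node.js instances are running locally are all hypothetical, not taken from the question:

```nginx
# Hypothetical example: three Node.js instances on local ports 3000-3002.
upstream node_app {
    # With no balancing directive, nginx defaults to round robin.
    server 127.0.0.1:3000;
    server 127.0.0.1:3001;
    server 127.0.0.1:3002;
}

server {
    listen 80;

    location / {
        proxy_pass http://node_app;
        # Pass the original host and client address through to the Node app.
        proxy_set_header Host $host;
        proxy_set_header X-Real-IP $remote_addr;
    }
}
```

Note that nginx simply distributes connections in order; it has no idea whether a given backend is "locked" in Redis.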

Your server doesn't necessarily need to accept connections and handle them all at once, but it at least needs to allow connections to queue up without crashing, and then accept and deal with them one by one.
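As a sketch of that "deal with each one by one" idea, assuming plain Node.js with no external libraries: instead of locking the entire server in Redis, you can serialize only the critical section (sign-up, in this hypothetical example) behind an in-process promise chain, so unrelated requests are never blocked:

```javascript
// Minimal in-process mutex: tasks passed to run() execute one at a time,
// in the order they were queued, while the rest of the event loop stays free.
class Mutex {
  constructor() {
    this._tail = Promise.resolve();
  }
  // Runs fn after all previously queued tasks have finished.
  run(fn) {
    const result = this._tail.then(fn);
    // Keep the chain alive even if fn rejects.
    this._tail = result.catch(() => {});
    return result;
  }
}

const signupLock = new Mutex();
const log = [];

// Hypothetical sign-up handler: only this section is serialized.
async function handleSignup(user) {
  return signupLock.run(async () => {
    log.push(`start ${user}`);
    await new Promise(r => setTimeout(r, 10)); // simulate slow work
    log.push(`end ${user}`);
  });
}

// Two "concurrent" sign-ups: the second waits for the first,
// but nothing else in the process is blocked meanwhile.
Promise.all([handleSignup('alice'), handleSignup('bob')]).then(() => {
  console.log(log.join(' | '));
});
```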

You can go the route you are discussing; that is, you can fire up a unique instance of your server (via Heroku or otherwise) for every single connection made to your app. But this is not efficient, and it will ultimately create more work for you in trying to architect a system that does that well. Why not just fix your server?

