Upstream Node server closing connection to nginx

I'm using nginx as a proxy for a Node server that's rate-limiting requests. The rate is one request every 30 seconds; most requests return a response fine, but if a request is kept open for an extended period of time, I get this:

upstream prematurely closed connection while reading response header from upstream

I cannot figure out what might be causing this. Below is my nginx configuration:

# For more information on configuration, see:
#   * Official English Documentation: http://nginx.org/en/docs/
#   * Official Russian Documentation: http://nginx.org/ru/docs/

user nginx;
worker_processes auto;
error_log /var/log/nginx/error.log;
pid /run/nginx.pid;

# Load dynamic modules. See /usr/share/nginx/README.dynamic.
# include /usr/share/nginx/modules/*.conf;

events {
    worker_connections 1024;
}

http {
    log_format  main  '$remote_addr - $remote_user [$time_local] "$request" '
                      '$status $body_bytes_sent "$http_referer" '
                      '"$http_user_agent" "$http_x_forwarded_for"';

    access_log  /var/log/nginx/access.log  main;

    sendfile            on;
    tcp_nopush          on;
    tcp_nodelay         on;
    keepalive_timeout   65;
    types_hash_max_size 2048;

    include             /etc/nginx/mime.types;
    default_type        application/octet-stream;

    # Load modular configuration files from the /etc/nginx/conf.d directory.
    # See http://nginx.org/en/docs/ngx_core_module.html#include
    # for more information.
    include /etc/nginx/conf.d/*.conf;

    server {
        listen       80 default_server;
        listen       [::]:80 default_server;
        server_name  _;
        root         /srv/www/main/htdocs;

        # Load configuration files for the default server block.
        include /etc/nginx/default.d/*.conf;    

        location /vcheck {
            proxy_pass http://127.0.0.1:8080$is_args$query_string;
            # proxy_buffer_size 128k;
            # proxy_buffers 4 256k;
            # proxy_busy_buffers_size 256k;
            # proxy_http_version 1.1;
            # proxy_set_header Upgrade $http_upgrade;
            # proxy_set_header Connection 'upgrade';
            # proxy_set_header Host $host;
            # proxy_cache_bypass $http_upgrade;         
            # proxy_redirect off;

            proxy_read_timeout 600s;
        }

        location ~ \.php$ {
            include fastcgi.conf;
            fastcgi_split_path_info ^(.+\.php)(/.+)$;
            fastcgi_pass unix:/var/run/php-fpm/php-fpm.sock;
            fastcgi_index routes.php$is_args$query_string;
        }

        location / {
            if (-f $request_filename) {
                expires max;
                break;
            }

            if ($request_filename !~ "\.(js|htc|ico|gif|jpg|png|css)$") {
                rewrite ^(.*) /routes.php last;
            }
        }       

    }
}

Is there a reason why Node could be closing the connection early?

EDIT: I'm using Node's built-in HTTP server.

It seems you have to extend the response timeout of your Node.js application.

If it's an Express app, you could try this:

install: npm i --save connect-timeout

use:

var express = require('express');
var timeout = require('connect-timeout');
var app = express();
app.use(timeout('60s'));



That said, I would recommend not keeping the connection waiting, but fixing the issue in the Node.js app itself: find out why it takes so long to respond.

It sounds like the app has a problem that prevents it from responding, so the request gets lost while nginx keeps waiting.
