
Rails Puma running out of Redis connections

I've looked around at other similar questions on SO but can't quite piece things together well enough. I have a Rails app (on Heroku) that uses Puma with both multiple processes and multiple threads. My app also uses Redis as a secondary data store (in addition to a SQL database), querying Redis directly (well, through the connection_pool gem). Here's my Puma config file:

workers Integer(ENV["WEB_CONCURRENCY"] || 4)
threads_count = Integer(ENV["MAX_THREADS"] || 5)
threads threads_count, threads_count

preload_app!

rackup DefaultRackup
port ENV["PORT"] || 3000
environment ENV["RACK_ENV"] || "development"

on_worker_boot do
  # Worker specific setup for Rails 4.1+
  ActiveRecord::Base.establish_connection

  redis_connections_per_process = Integer(ENV["REDIS_CONNS_PER_PROCESS"] || 5)
  $redis = ConnectionPool.new(size: redis_connections_per_process) do
    Redis.new(url: ENV["REDIS_URL"] || "redis://localhost:6379/0")
  end
end

My Redis instance has a connection limit of 20, and I find myself regularly going over this limit, despite having what should be (as far as I can tell) only 5 connections per process spread across 4 worker processes.

In fact, I even get "max number of clients reached" errors from Redis when I set REDIS_CONNS_PER_PROCESS to 1. Is on_worker_boot called for each thread rather than for each process?

I've also tried having a separate redis.rb initializer, which still gives me errors even when REDIS_CONNS_PER_PROCESS is 1. That seems odd, since if I'm doing my math correctly I should be able to go up to 4: (4 worker processes + 1 master process) * 4 connections per process = 20. (Note that for the purposes of this question I'm ignoring errors that occur around deploys, since I'm assuming Heroku may have both old and new processes connected during that window, even though I'm not using Preboot.)
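For reference, here's the back-of-the-envelope arithmetic I'm working from (just a sketch; the numbers mirror the env vars in my config above, and the extra master process is my assumption about preload_app!, not something Puma documents):

# Rough ceiling on Redis connections if every pool is fully checked out.
# Numbers are hypothetical and mirror the env vars above.
puma_workers      = 4  # WEB_CONCURRENCY
master_processes  = 1  # assuming preload_app! also boots the app in the master
conns_per_process = 4  # REDIS_CONNS_PER_PROCESS

(puma_workers + master_processes) * conns_per_process
# => 20, exactly my Redis plan's connection limit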

Where am I misunderstanding how this all fits together?

I had a similar problem. At first I was using Redis To Go and had no problems, but after I switched from Redis To Go to Heroku Redis, I started getting "ERR max number of clients reached" errors.

My app's code hadn't changed; switching Redis providers was the only difference.

I opened a ticket with Heroku support, and they advised me to change the default timeout setting:

https://devcenter.heroku.com/articles/heroku-redis#configuring-your-instance

After I changed the default timeout value of Heroku Redis, everything was solved. I guess the default Redis timeout differs between providers; Heroku Redis's default is 0, and "a value of zero means that connections will not be closed."

I hope my experience is helpful.

After further reading and testing, I ended up moving my Redis connection pool code into a separate initializer. Unfortunately, this didn't solve my problem at all—despite lots of tinkering with process and connection numbers, I was still getting max number of clients reached errors way before I should have been.
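For reference, the initializer ended up looking roughly like this (a sketch; the config/initializers/redis.rb location and the pool size are my own choices, nothing here is required by Rails or connection_pool):

# config/initializers/redis.rb
require "connection_pool"
require "redis"

redis_connections_per_process = Integer(ENV["REDIS_CONNS_PER_PROCESS"] || 5)

# One pool per process; workers inherit this when Puma forks.
$redis = ConnectionPool.new(size: redis_connections_per_process) do
  Redis.new(url: ENV["REDIS_URL"] || "redis://localhost:6379/0")
end

Callers then use $redis.with { |conn| conn.get("some-key") }, which checks a connection out of the pool and hands it back when the block finishes.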

The answer, it turns out, was to switch Redis providers from Heroku Redis to Redis Cloud. I'm not sure why Heroku Redis wasn't allowing the number of connections it advertises, but upon some investigation Redis Cloud actually appears to allow more connections than advertised (or at least limit connections transparently and without errors) without any issues whatsoever. Wow. They've certainly earned my business.

I ran into this problem also, and while the Heroku Redis dashboard showed only a few connections, I was running out of connections.

Then I contacted Heroku support, and they told me that the dashboard only shows active clients/connections, not the idle ones.

So because the Redis timeout is 0 (never time out), connections from before a reboot sit idle while new ones are opened, and the situation gets worse with every reboot.
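If you want to see the idle connections the dashboard hides, you can ask the Redis server itself (a quick sketch using the redis-rb gem; it assumes REDIS_URL points at your Heroku Redis instance):

require "redis"

redis = Redis.new(url: ENV["REDIS_URL"])

redis.info("clients")["connected_clients"]  # total client connections, idle ones included
redis.call("CLIENT", "LIST")                # one line per connection; the idle= field is seconds spent idle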

A solution, as mentioned by others on this page, is to set the timeout to something other than 0:

heroku redis:timeout -s 10 -a APPLICATION_NAME

This makes idle connections die after 10 seconds, which shouldn't be a problem: a connection that's actively being used stays open (no unnecessary closes).

If you have very little traffic, you might consider setting this a bit higher.
