
How many concurrent connections can PostgreSQL handle well?

I have 3 tables that are accessed on every request. My PHP web page sends requests to an intermediate C-based layer, which in turn passes the calls to PostgreSQL. When I browse from a single tab, all records are fetched properly. But when I make requests from many browsers/tabs at once, at least half of them fail, and since this is an embedded platform, debugging is a real problem. I suspect that the concurrent database connections from the various browsers are being queued and that something goes wrong there. Could anybody please share your thoughts on this production issue?

Don't guess, know.

You can get PostgreSQL to log connects and disconnects. It's probably already logging an error message for you. Check the logs.
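For example, connection logging can be switched on in postgresql.conf; a minimal sketch (the log directory below is an assumption, adjust it to your installation):

```
# postgresql.conf -- illustrative settings, reload/restart after changing
log_connections = on          # log each successful connection attempt
log_disconnections = on       # log session end and its duration
logging_collector = on        # capture stderr output into log files
log_directory = 'pg_log'      # assumed location, relative to the data dir
```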

Difficult to say more without knowing what this C layer is doing. You may find a connection pooler helpful; start by looking at pgbouncer, perhaps.
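A minimal pgbouncer configuration might look like this. The database name, ports, and pool sizes are assumptions for illustration, not values from the question:

```
; pgbouncer.ini -- illustrative sketch
[databases]
mydb = host=127.0.0.1 port=5432 dbname=mydb

[pgbouncer]
listen_addr = 127.0.0.1
listen_port = 6432            ; clients connect here instead of 5432
auth_type = md5
auth_file = /etc/pgbouncer/userlist.txt
pool_mode = session           ; safest default; transaction mode pools more aggressively
max_client_conn = 100         ; many cheap client slots...
default_pool_size = 20        ; ...multiplexed onto a few real backends
```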

If you're not getting any errors from the C layer, check the PostgreSQL log file to see if you're getting errors there.

My initial guesses would be that either:

a) The C layer is getting connections confused between threads and doing two things with the same connection at once.

or b) The C layer isn't releasing connections when it's done with them, so you're hitting connection or memory limits (see the libpq sketch below).
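A minimal libpq sketch of the pattern that avoids both failure modes: each thread of work opens its own connection, errors are surfaced rather than swallowed, and the connection is always released. The connection string and table name are placeholders, not details from the question:

```c
/* Compile with: cc fetch.c -lpq */
#include <stdio.h>
#include <libpq-fe.h>

int fetch_records(const char *conninfo)
{
    /* Each thread must use its own PGconn; a single PGconn is not
     * safe to use from two threads at once (guess a above). */
    PGconn *conn = PQconnectdb(conninfo);
    if (PQstatus(conn) != CONNECTION_OK) {
        /* Log the real error instead of failing silently. */
        fprintf(stderr, "connection failed: %s", PQerrorMessage(conn));
        PQfinish(conn);
        return -1;
    }

    PGresult *res = PQexec(conn, "SELECT id, name FROM records");
    if (PQresultStatus(res) != PGRES_TUPLES_OK) {
        fprintf(stderr, "query failed: %s", PQerrorMessage(conn));
        PQclear(res);
        PQfinish(conn);
        return -1;
    }

    for (int i = 0; i < PQntuples(res); i++)
        printf("%s | %s\n", PQgetvalue(res, i, 0), PQgetvalue(res, i, 1));

    PQclear(res);
    PQfinish(conn);   /* always release the connection (guess b above) */
    return 0;
}
```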

Either way, you really need to be able to find the error before you can make progress.

PostgreSQL itself will handle lots of concurrent connections, although you'd probably want to pool them to reduce the setup/teardown cost of each session.
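If you want to see whether connections really are piling up, a quick check from psql uses only the standard catalog views (nothing here is specific to the setup in the question):

```sql
-- How many backends are currently connected, and in what state?
SELECT datname, state, count(*)
FROM pg_stat_activity
GROUP BY datname, state;

-- The hard ceiling the server was started with
SHOW max_connections;
```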
