Problem Description
Is there any practical limit on the number of parallel connections that a
PostgreSQL server can service? We're in the process of setting up a system
that will require up to 10000 connections open in parallel. The query load
is not the problem, but we're wondering about the number of connections.
Does anyone have experience with these kinds of numbers?
Recommended Answer
No experience, but a little thinking and some elementary-school math tells
me that you'd need a huge amount of RAM to support 10000 connections,
since postgres is multi-process. Our typical postgres process eats 5-40
megs of memory, depending on activity. So even if it were just 5 megs,
with 10k connections we are talking about 50G of RAM. If those
connections are idle, that is a plain waste of resources.
I suggest you look into some sort of connection pooling.
--
Michal Taborsky
http://www.taborsky.cz
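To make the arithmetic concrete and illustrate the pooling suggestion, here is a minimal sketch in Python using psycopg2's built-in pool. The DSN, the pool sizes, and the 5 MB per-backend figure are assumptions for illustration, not measured values:

    # Back-of-the-envelope memory estimate from the post above
    # (assumed figure: ~5 MB of RAM per idle postgres backend).
    PER_BACKEND_MB = 5
    print(f"~{PER_BACKEND_MB * 10000 / 1024:.0f} GB RAM")  # ~49 GB for 10k backends

    # Client-side pooling sketch (hypothetical DSN, arbitrary pool sizes).
    from psycopg2.pool import ThreadedConnectionPool

    pool = ThreadedConnectionPool(
        minconn=5,    # keep a few backends warm
        maxconn=50,   # cap the number of real server connections
        dsn="dbname=test user=app host=localhost",
    )

    conn = pool.getconn()   # borrow a connection from the pool
    try:
        with conn.cursor() as cur:
            cur.execute("SELECT 1")
            print(cur.fetchone())
    finally:
        pool.putconn(conn)  # return it to the pool instead of closing

With a pool like this, thousands of application-level clients can share a few dozen real backends, which is what keeps the 50G figure from ever materializing.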
There is already a connection pool in front of the real server, but the
connection pool doesn't help you if you have in fact 10000 concurrent
requests; it only saves connection start effort. (You could make the
connection pool server queue the requests, but that is not the point of this
exercise.) I didn't quite consider the RAM question, but the machine is
almost big enough that it wouldn't matter. I'm thinking more in terms of the
practical limits of the internal structures or the (Linux 2.6) kernel.
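Since the remaining question is about kernel-side limits: on Linux, each backend costs a process slot, and every client socket and open file counts against the descriptor limits, so those are the knobs to check first. Here is a small Linux-specific probe using only Python's standard library (the /proc paths and the actual values vary per system):

    import resource

    # Per-process cap on open file descriptors; the postmaster's
    # listening sockets and each backend's files count against it.
    soft, hard = resource.getrlimit(resource.RLIMIT_NOFILE)
    print(f"RLIMIT_NOFILE: soft={soft} hard={hard}")

    # System-wide caps (Linux /proc interface).
    with open("/proc/sys/fs/file-max") as f:
        print("fs.file-max:", f.read().strip())
    with open("/proc/sys/kernel/pid_max") as f:
        print("kernel.pid_max:", f.read().strip())  # each backend is one process

Beyond the kernel, PostgreSQL's own max_connections setting (and the shared memory it sizes) has to be raised to match, so that is usually the first practical ceiling to move.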