This article looks at how to run multiple concurrent Python programs that access the same database table. Hopefully it is a useful reference for anyone facing the same problem.
Problem description
Is there anything in Python that allows you to run multiple concurrent Python programs that could potentially access the same database table, and to prevent each program from using the full CPU, thereby leaving the server some additional capacity?
Recommended answer
A few points:

- multiple concurrent Python programs - see http://wiki.python.org/moin/Concurrency; to start, I would try the built-in multiprocessing module (http://docs.python.org/2/library/multiprocessing.html)
- access the same database table - each process should create its own db connection; beyond that, concurrency is managed by, or configured within, the RDBMS and/or the connection/query options. If you really need to synchronize between processes, Locks/Semaphores could help.
- prevent each program from using the full CPU - it depends on what your processes should do; I would go with:
- one main program that runs all the time (master process), pauses (time.sleep, gevent.sleep or similar), and spawns and controls the spawned processes (workers)
- the spawned processes do the job (workers) - each opens a new connection, does its db actions, and quits
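The master/worker pattern above can be sketched with the standard-library multiprocessing module. This is a minimal sketch, not a production design: sqlite3 stands in for the real RDBMS, and `DB_PATH`, `init_db`, and the `results` table are made-up names for illustration. The key ideas from the answer are all here: each worker opens its own connection and quits, and the master only spawns and sleeps, capping how many workers (and thus how much CPU) run at once.

```python
import multiprocessing
import os
import sqlite3
import time

DB_PATH = "jobs.db"  # hypothetical SQLite file standing in for the shared table


def init_db():
    # One-time setup of the shared table (illustrative schema).
    conn = sqlite3.connect(DB_PATH)
    conn.execute("CREATE TABLE IF NOT EXISTS results (job_id INTEGER)")
    conn.commit()
    conn.close()


def worker(job_id):
    # Each worker opens its OWN connection, does its db action, and quits.
    # The timeout lets sqlite wait out brief write locks from sibling workers.
    conn = sqlite3.connect(DB_PATH, timeout=10)
    try:
        conn.execute("INSERT INTO results (job_id) VALUES (?)", (job_id,))
        conn.commit()
    finally:
        conn.close()


def master(num_jobs, max_workers=2, pause=0.05):
    # The master never does heavy work itself: it keeps at most max_workers
    # processes alive and sleeps between checks, leaving spare CPU capacity.
    active, next_job = [], 0
    while next_job < num_jobs or active:
        active = [p for p in active if p.is_alive()]
        while next_job < num_jobs and len(active) < max_workers:
            p = multiprocessing.Process(target=worker, args=(next_job,))
            p.start()
            active.append(p)
            next_job += 1
        time.sleep(pause)


if __name__ == "__main__":
    init_db()
    master(num_jobs=5)
    conn = sqlite3.connect(DB_PATH)
    print(conn.execute("SELECT COUNT(*) FROM results").fetchone()[0])
    conn.close()
    os.remove(DB_PATH)
```

Raising `pause` or lowering `max_workers` trades throughput for headroom; against a real RDBMS you would swap the sqlite3 calls for your driver's connect/execute equivalents.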
I'm sure that some of the workflows/systems provided by the multiprocessing (or other) modules can fit your needs (Workers, Pools, Queues, Pipes, shared state, synchronization, ...).
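Of those, a Pool is the shortest route to capping concurrency: the `processes` argument bounds how many workers (and therefore how many cores) run at once, no matter how many jobs you queue. A minimal sketch, where `work` is a placeholder for whatever each job does:

```python
import multiprocessing


def work(n):
    # Placeholder for a per-job task (e.g. one db action); returns a value
    # so the results can be collected by the parent.
    return n * n


if __name__ == "__main__":
    # processes=2 caps the pool at two concurrent workers, regardless of
    # how many items are in the iterable.
    with multiprocessing.Pool(processes=2) as pool:
        results = pool.map(work, range(10))
    print(results)
```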
That concludes this article on running multiple concurrent Python programs that access the same database table; hopefully the recommended answer is helpful.