Problem Description
I am using Python requests in Celery workers to make a large number of API calls (~10/sec, including GET, POST, PUT, and DELETE). Each request takes around 5-10 seconds to complete.
I tried running the Celery workers with the eventlet pool and a concurrency of 1000.
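For context, a hypothetical task along these lines illustrates the setup; the project name, broker URL, and task body are assumptions, not taken from the question, and the worker would be started with something like celery -A proj worker -P eventlet -c 1000:

# Hypothetical setup illustrating the question; names and broker are assumptions.
# Worker started with: celery -A proj worker -P eventlet -c 1000
import requests
from celery import Celery

app = Celery('proj', broker='redis://localhost:6379/0')

@app.task
def call_api(url, payload):
    # Each call blocks for the 5-10 seconds the API takes to respond.
    return requests.post(url, json=payload, timeout=30).json()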
Since requests blocks, each of the concurrent connections just sits waiting on a single request.
How can I make requests asynchronous?
Recommended Answer
Use eventlet monkey patching to make any pure-Python library non-blocking.
Patch a single library
# import requests # instead do this:
import eventlet
requests = eventlet.import_patched('requests')
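As a minimal sketch of how the patched module might be used inside a task (the module layout, task name, and broker URL are assumptions):

# Sketch only: the patched module exposes the same API as plain requests,
# so existing task code does not need to change.
import eventlet
requests = eventlet.import_patched('requests')

from celery import Celery

app = Celery('proj', broker='redis://localhost:6379/0')

@app.task
def call_api(url):
    # Under the eventlet worker pool this call now yields to other
    # green threads while waiting on the network instead of blocking them.
    return requests.get(url, timeout=30).status_code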
The erequests and grequests packages could essentially be stripped down to these two lines.
Patch everything
import eventlet
eventlet.monkey_patch() # must execute as early as possible
...
# everything is non-blocking now:
import requests, amqp, memcache, paramiko, redis
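A standalone sketch of the effect (the URLs and pool size are illustrative, not part of the answer): once the standard library is patched, green threads overlap their network waits, so many slow requests finish in roughly the time of one.

import eventlet
eventlet.monkey_patch()  # must run before sockets/requests are imported

import time
import requests

urls = ['https://httpbin.org/delay/2'] * 5  # illustrative endpoints

pool = eventlet.GreenPool(size=5)
start = time.time()
for status in pool.imap(lambda u: requests.get(u, timeout=30).status_code, urls):
    print(status)
# Roughly one request's latency in total, not the sum of all five.
print('elapsed: %.1fs' % (time.time() - start))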
Update: there is a known issue with monkey-patching the requests library. If you get:
ImportError: cannot import name utils
then change the import line to
requests = eventlet.import_patched('requests.__init__')