Problem description
My requirement is to generate hundreds of HTTP POST requests per second. I am doing it using urllib2.
import datetime
import time
import urllib2
from multiprocessing import Process

# url, data and ftime are assumed to be defined elsewhere
def send():
    req = urllib2.Request(url)
    req.add_data(data)
    response = urllib2.urlopen(req)

while datetime.datetime.now() <= ftime:
    p = Process(target=send, args=[])
    p.start()
    time.sleep(0.001)
The problem is that this code sometimes, for some iterations, throws either of the following exceptions:
HTTP 503 Service Unavailable.
URLError: <urlopen error [Errno -2] Name or service not known>
I have tried using requests (HTTP for humans) as well, but I am having some proxy issues with that module. It seems requests sends HTTP packets to the proxy server even when the target machine is on the same LAN. I don't want the packets to go through a proxy server.
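On the proxy point: requests honours the http_proxy / https_proxy environment variables by default, which would explain LAN traffic being routed through a proxy. A minimal sketch of bypassing them with a Session (the target URL here is a hypothetical placeholder):

import requests

session = requests.Session()
session.trust_env = False  # ignore http_proxy/https_proxy environment variables

# hypothetical LAN target; replace with the real one
response = session.post("http://192.168.0.10/endpoint", data="key=value")
print(response.status_code)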
Recommended answer
The simplest way to limit the number of concurrent connections is to use a thread pool:
#!/usr/bin/env python
from itertools import izip, repeat
from multiprocessing.dummy import Pool  # use threads for I/O-bound tasks
from urllib2 import urlopen

def fetch(url_data):
    try:
        return url_data[0], urlopen(*url_data).read(), None
    except EnvironmentError as e:
        return url_data[0], None, str(e)

if __name__ == "__main__":
    pool = Pool(20)  # use 20 concurrent connections
    params = izip(urls, repeat(data))  # use the same data for all urls
    for url, content, error in pool.imap_unordered(fetch, params):
        if error is None:
            print("done: %s: %d" % (url, len(content)))
        else:
            print("error: %s: %s" % (url, error))
503 Service Unavailable is a server error. It might fail to handle the load.
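If the 503s are transient overload, one common mitigation is to retry with a delay that grows between attempts. A minimal sketch, with arbitrary retry counts and delays:

import time
import urllib2

def fetch_with_retry(url, data, retries=3, delay=1.0):
    # retry only on 503; re-raise any other HTTP error immediately
    for attempt in range(retries):
        try:
            return urllib2.urlopen(url, data).read()
        except urllib2.HTTPError as e:
            if e.code != 503 or attempt == retries - 1:
                raise
            time.sleep(delay * (attempt + 1))  # back off a little more each time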
Name or service not known is a DNS error. If you need to make many requests, install/enable a local caching DNS server.
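A local caching DNS server (dnsmasq, for example) is the robust fix. As a rough in-process stopgap, one could also cache socket.getaddrinfo results so repeated requests to the same host don't hit the resolver every time; this is a sketch of a workaround, not a hardened solution:

import socket

_dns_cache = {}
_orig_getaddrinfo = socket.getaddrinfo

def _cached_getaddrinfo(*args):
    # cache resolver answers, keyed by the full argument tuple
    if args not in _dns_cache:
        _dns_cache[args] = _orig_getaddrinfo(*args)
    return _dns_cache[args]

socket.getaddrinfo = _cached_getaddrinfo  # urllib2 resolves hosts through this call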