This article looks at how to handle a Python aiohttp request that stops without raising any exception; it should be a useful reference for anyone running into the same problem.

Problem description

I use aiohttp to request a URL. Most of the time it runs normally, but sometimes it stops without raising any exception.

As you can see in the code, I catch all the exceptions, but when it stops, no exception is logged.

The log looks like:

get_live_league_games: while True
try
yield from aiohttp.request

but 'res = yield from r.json()' is never printed; the coroutine just stops and does not throw any exception.

import asyncio
import logging
import time

import aiohttp


@asyncio.coroutine
def get_live_league_games(url):
    while True:
        print('get_live_league_games: while True')
        start = time.clock()
        try:
            print('try')
            # Send the request and wait for the response headers.
            r = yield from aiohttp.request('GET', url)
            print('yield from aiohttp.request')
            # Read and decode the JSON body; according to the log above,
            # this is where the coroutine hangs.
            res = yield from r.json()
            print('res = yield from r.json()')
        except aiohttp.errors.DisconnectedError as e:
            logging.warning('get_live_league_games: %s', e)
            yield from asyncio.sleep(10)
            continue
        except aiohttp.errors.ClientError as e:
            logging.warning('get_live_league_games: %s', e)
            yield from asyncio.sleep(10)
            continue
        except aiohttp.errors.HttpProcessingError as e:
            logging.warning('get_live_league_games: %s', e)
            yield from asyncio.sleep(10)
            continue
        except Exception as e:
            logging.warning('get_live_league_games, Exception: %s', e)
            yield from asyncio.sleep(10)
            continue
        print('request internet time : ', time.clock() - start)
        yield from asyncio.sleep(10)


Recommended answer

That may happen due to the nature of the internet -- a connection can 'hang' for a very long period before a disconnection error is raised.

That's why you usually need a timeout for client HTTP operations.

I suggest wrapping the aiohttp.request() call in asyncio.wait_for().
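Below is a minimal sketch of that idea, built on the loop from the question; the 30-second timeout, the simplified exception handling and the 10-second retry sleep are illustrative assumptions rather than part of the original answer. It also applies the same timeout to the body read, since the question's log suggests that is where the hang occurs.

import asyncio
import logging

import aiohttp


@asyncio.coroutine
def get_live_league_games(url):
    while True:
        try:
            # Give up if no response arrives within 30 seconds
            # (the 30-second value is only an example).
            r = yield from asyncio.wait_for(aiohttp.request('GET', url), timeout=30)
            # Reading the body can hang as well, so bound it with a timeout too.
            res = yield from asyncio.wait_for(r.json(), timeout=30)
        except asyncio.TimeoutError:
            logging.warning('get_live_league_games: request timed out, retrying')
            yield from asyncio.sleep(10)
            continue
        except Exception as e:
            logging.warning('get_live_league_games: %s', e)
            yield from asyncio.sleep(10)
            continue
        # ... process res here ...
        yield from asyncio.sleep(10)

When the timeout expires, asyncio.wait_for cancels the wrapped operation and raises asyncio.TimeoutError, so the loop logs a warning and retries instead of hanging indefinitely.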

That concludes this article on a Python aiohttp request stopping without raising an exception; hopefully the recommended answer is helpful.
