This question already has answers here:
asynchronous python itertools chain multiple generators (2 answers)
Closed 1 year ago.

I want to listen for events from multiple instances of the same object and merge the resulting event streams into a single stream. For example, using async generators:
import asyncio


class PeriodicYielder:
    def __init__(self, period: int) -> None:
        self.period = period

    async def updates(self):
        # Yield the period value endlessly, once every `period` seconds.
        while True:
            await asyncio.sleep(self.period)
            yield self.period

I can successfully listen for events from a single instance:
async def get_updates_from_one():
    each_1 = PeriodicYielder(1)
    async for n in each_1.updates():
        print(n)
# 1
# 1
# 1
# ...

But how do I get events from multiple async generators? In other words: how do I iterate over several async generators in the order in which they are ready to produce their next value?
async def get_updates_from_multiple():
    each_1 = PeriodicYielder(1)
    each_2 = PeriodicYielder(2)
    async for n in magic_async_join_function(each_1.updates(), each_2.updates()):
        print(n)
# 1
# 1
# 2
# 1
# 1
# 2
# ...

Is there such a magic_async_join_function in the stdlib or in a 3rd-party module?

Best Answer

You can use the wonderful aiostream library. It looks like this:

import asyncio
from aiostream import stream


async def test1():
    for _ in range(5):
        await asyncio.sleep(0.1)
        yield 1


async def test2():
    for _ in range(5):
        await asyncio.sleep(0.2)
        yield 2


async def main():
    combine = stream.merge(test1(), test2())

    async with combine.stream() as streamer:
        async for item in streamer:
            print(item)


asyncio.run(main())

Result:
1
1
2
1
1
2
1
2
2
2

Regarding python - joining multiple async generators in Python, we found a similar question on Stack Overflow: https://stackoverflow.com/questions/55299564/
