Fork-join model implementation for Python? (Equivalent of Java's ForkJoinPool)


Problem description

I am looking for an implementation of the fork-join model for Python. Like Java's ForkJoinPool, it should allow the work of a task to be split (forked) recursively into several subtasks. Once the subtasks are completed, their results are joined and returned. Ideally, it should support both threads and processes, similar to ThreadPoolExecutor and ProcessPoolExecutor in concurrent.futures, but threads are more important for now. It must allow the number of threads to be limited (I want one thread per core). I am aware that this will only be useful if the code releases the GIL.

Example from Wikipedia to clarify the fork-join model:

solve(problem):
    if problem is small enough:
        solve problem directly (sequential algorithm)
    else:
        for part in subdivide(problem):
            fork subtask to solve(part)
        join all subtasks spawned in previous loop
        return combined results
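As a minimal sketch (not a library), the pseudocode above can be expressed with the standard-library concurrent.futures module: each call forks its halves as subtasks via executor.submit() and joins them with Future.result(). The names THRESHOLD and max_fork_depth are illustrative assumptions; the depth cutoff is there because blocking joins inside a fixed-size pool can deadlock once every worker is waiting on a subtask.

```python
from concurrent.futures import ThreadPoolExecutor

THRESHOLD = 4  # assumed cutoff below which we solve sequentially

def solve(executor, data, depth=0, max_fork_depth=2):
    # Small enough (or too deep to keep forking): sequential algorithm.
    if len(data) <= THRESHOLD or depth >= max_fork_depth:
        return sum(data)
    mid = len(data) // 2
    # Fork: submit each half as a subtask.
    left = executor.submit(solve, executor, data[:mid], depth + 1, max_fork_depth)
    right = executor.submit(solve, executor, data[mid:], depth + 1, max_fork_depth)
    # Join: block until both subtasks finish, then combine the results.
    return left.result() + right.result()

if __name__ == "__main__":
    with ThreadPoolExecutor(max_workers=4) as ex:
        print(solve(ex, list(range(100))))  # 4950
```

Note that this sketch only limits fork depth by hand; Java's ForkJoinPool avoids the deadlock problem with work-stealing, which ThreadPoolExecutor does not provide.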

Is there such a library in Python? I could not find one.

Recommended answer

I figured that what you want is to collect the results; multiprocessing.Pool.starmap() might be the choice. Here is an example:

import multiprocessing as mp

def func(x, y):
    return x + y

if __name__ == "__main__":  # required on start methods that spawn workers
    # One worker process per core; starmap unpacks each tuple as arguments.
    with mp.Pool(mp.cpu_count()) as p:
        results = p.starmap(func, [(1, 2), (2, 3), (3, 4)])
    print(results)  # [3, 5, 7]
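Since the question prefers threads over processes, it may help that the same starmap API is also available on multiprocessing.pool.ThreadPool, which uses threads within one process. A sketch, reusing the same func:

```python
from multiprocessing.pool import ThreadPool

def func(x, y):
    return x + y

if __name__ == "__main__":
    # Four worker threads in the current process; useful when the
    # work releases the GIL (I/O, NumPy, C extensions).
    with ThreadPool(4) as p:
        results = p.starmap(func, [(1, 2), (2, 3), (3, 4)])
    print(results)  # [3, 5, 7]
```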

