This article explains how to avoid printing a stack trace when a Pool-based Python script is interrupted; it may serve as a useful reference for anyone facing the same problem.

Problem Description

I use a Pool to run several commands simultaneously. I would like to avoid printing the stack trace when the user interrupts the script.

Here is the structure of my script:

from multiprocessing import Pool
from subprocess import Popen, PIPE

def worker(some_element):
    try:
        cmd_res = Popen(SOME_COMMAND, stdout=PIPE, stderr=PIPE).communicate()
    except (KeyboardInterrupt, SystemExit):
        pass
    except Exception as e:
        print str(e)
        return

    # deal with cmd_res...

pool = Pool()
try:
    pool.map(worker, some_list, chunksize=1)
except KeyboardInterrupt:
    pool.terminate()
    print 'bye!'

By calling pool.terminate() when KeyboardInterrupt is raised, I expected the stack trace not to be printed, but it doesn't work; sometimes I get something like:

^CProcess PoolWorker-6:
Traceback (most recent call last):
  File "/usr/lib/python2.7/multiprocessing/process.py", line 258, in _bootstrap
    self.run()
  File "/usr/lib/python2.7/multiprocessing/process.py", line 114, in run
    self._target(*self._args, **self._kwargs)
  File "/usr/lib/python2.7/multiprocessing/pool.py", line 102, in worker
    task = get()
  File "/usr/lib/python2.7/multiprocessing/queues.py", line 374, in get
    racquire()
KeyboardInterrupt
Process PoolWorker-1:
Process PoolWorker-2:
Traceback (most recent call last):
  File "/usr/lib/python2.7/multiprocessing/process.py", line 258, in _bootstrap
Traceback (most recent call last):

...
bye!

Do you know how I could hide this?

Thanks.

Recommended Answer

In your case you don't even need pool processes or threads, and it then becomes easier to silence KeyboardInterrupt with try/except.

Pool processes are useful when your Python code does CPU-intensive calculations that can profit from parallelization. Threads are useful when your Python code does complex blocking I/O that can run in parallel. Here you just want to execute multiple programs in parallel and wait for their results. When you use a Pool, you create processes that do nothing other than start other processes and wait for them to terminate.

The simplest solution is to create all of the processes in parallel and then call .communicate() on each of them:

from subprocess import Popen, PIPE

processes = []
try:
    # Start all processes at once
    for element in some_list:
        processes.append(Popen(SOME_COMMAND, stdout=PIPE, stderr=PIPE))
    # Fetch their results sequentially
    for process in processes:
        cmd_res = process.communicate()
        # Process your result here
except KeyboardInterrupt:
    # Kill every child that is still running; ignore those already gone
    for process in processes:
        try:
            process.terminate()
        except OSError:
            pass

This works as long as the output on STDOUT and STDERR isn't too big. Otherwise, when a process other than the one communicate() is currently being called on produces more output than the PIPE buffer holds (usually around 1-8 kB), the OS suspends it until communicate() is called on it. In that case you need a more sophisticated solution:

Since Python 3.4 you can use the asyncio module for single-threaded pseudo-multithreading:

import asyncio
from asyncio.subprocess import PIPE

loop = asyncio.get_event_loop()

@asyncio.coroutine
def worker(some_element):
    process = yield from asyncio.create_subprocess_exec(*SOME_COMMAND, stdout=PIPE)
    try:
        cmd_res = yield from process.communicate()
    except KeyboardInterrupt:
        process.terminate()
        return
    try:
        pass # Process your result here
    except KeyboardInterrupt:
        return

# Start all workers, scheduling each coroutine as a Task
workers = []
for element in some_list:
    w = asyncio.async(worker(element))
    workers.append(w)

# Run until everything is complete
loop.run_until_complete(asyncio.wait(workers))

You should be able to limit the number of concurrent processes using e.g. asyncio.Semaphore if you need to.
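
For illustration, here is a minimal sketch of that idea, assuming the same SOME_COMMAND placeholder as above and a hypothetical limit of four concurrent subprocesses; only the worker changes, the scheduling code stays the same:

import asyncio
from asyncio.subprocess import PIPE

# Hypothetical limit: at most 4 subprocesses at the same time.
semaphore = asyncio.Semaphore(4)

@asyncio.coroutine
def worker(some_element):
    # Wait for a free slot before spawning the subprocess; the slot
    # is released automatically when the with-block exits.
    with (yield from semaphore):
        process = yield from asyncio.create_subprocess_exec(
            *SOME_COMMAND, stdout=PIPE)
        cmd_res = yield from process.communicate()
        # Process your result here

Each worker then blocks on the semaphore instead of spawning immediately, so no more than four subprocesses exist at any moment.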
