Python multiprocessing: gracefully kill a process

This article covers how to gracefully kill a process with Python's multiprocessing module. The discussion below may be a useful reference for anyone facing the same problem.

Problem description

import multiprocessing
import schedule


def worker():
    # do some stuff
    pass


def sched(argv):
    schedule.every(0.01).minutes.do(worker)
    while True:
        schedule.run_pending()


processs = []
..
..
p = multiprocessing.Process(target=sched, args=(argv,))
..
..
processs.append(p)

for p in processs:
    p.terminate()

Does p.terminate() kill a list of processes gracefully?

If not, what is the simplest way to do it?

The goal is to reload the configuration file into memory, so I would like to kill all child processes and create new ones instead; those latter will read the new config file.

Edit: Added more code to explain that I am running a while True loop.

Edit: This is the new code after @dano's suggestion:

def get_config(self):
    from ConfigParser import SafeConfigParser
    ..
    return argv

def sched(self, args, event):
    # schedule instruction:
    schedule.every(0.01).minutes.do(self.worker, args)
    while not event.is_set():
        schedule.run_pending()

def dispatch_processs(self, conf):
    processs = []
    event = multiprocessing.Event()

    for conf in self.get_config():
        # args must be a tuple, not a generator
        process = multiprocessing.Process(target=self.sched, args=tuple(conf), kwargs={'event': event})
        processs.append((process, event))
    return processs

def start_process(self, process):
    process.start()

def gracefull_process(self, process):
    process.join()

def main(self):
    while True:
        processs = self.dispatch_processs(self.get_config())
        print("%s processes running" % len(processs))

        for process, event in processs:
            self.start_process(process)
            time.sleep(1)
            event.set()
            self.gracefull_process(process)

The good thing about the code is that I can edit the config file and the process will reload its config as well.

The problem is that only the first process runs and the others are ignored.
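A likely cause: dispatch_processs creates a single Event before the loop and shares it between all processes, so the first event.set() in main signals every worker at once, and each subsequent process exits as soon as it starts. A minimal sketch of the fix, giving each process its own Event (the loop function and the process count here are hypothetical stand-ins, with no dependency on schedule):

```python
import multiprocessing
import time


def loop(event):
    # Run until this process's own event is set.
    while not event.is_set():
        time.sleep(0.01)


def dispatch():
    # Create a *separate* Event inside the loop, one per process,
    # so setting an event stops only the process that owns it.
    processes = []
    for _ in range(3):
        event = multiprocessing.Event()
        p = multiprocessing.Process(target=loop, args=(event,))
        processes.append((p, event))
    return processes


if __name__ == '__main__':
    processes = dispatch()
    for p, _ in processes:
        p.start()
    for p, event in processes:
        event.set()
        p.join()
```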

Edit: This saved my life. Working with while True in sched() is not a good idea, so I set up a refresh_time instead:

def sched(self, args, event):
    schedule.every(0.01).minutes.do(self.worker, args)
    for i in range(refresh_time):
        schedule.run_pending()
        time.sleep(1)

def start_processs(self, processs):
    for p, event in processs:
        if not p.is_alive():
            p.start()
        time.sleep(1)
        event.set()

    self.gracefull_processs(processs)

def gracefull_processs(self, processs):
    for p, event in processs:
        p.join()
    processs = self.dispatch_processs(self.get_config())
    self.start_processs(processs)

def main(self):
    while True:
        processs = self.dispatch_processs(self.get_config())
        self.start_processs(processs)
        break
    print("Reloading function main")
    self.main()

Recommended answer

If you don't mind aborting only after worker has completed all of its work, it's very simple to add a multiprocessing.Event to handle exiting gracefully:

import multiprocessing
import schedule


def worker():
    # do some stuff
    pass

def sched(argv, event=None):
    schedule.every(0.01).minutes.do(worker)
    while not event.is_set():  # Run until we're told to shut down.
        schedule.run_pending()

processes = []
..
..
event = multiprocessing.Event()
p = multiprocessing.Process(target=sched, args=(argv,), kwargs={'event': event})
..
..
processes.append((p, event))

# Tell all processes to shut down
for _, event in processes:
    event.set()

# Now actually wait for them to shut down
for p, _ in processes:
    p.join()
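The pattern above can be condensed into a self-contained script. This is a sketch, not the exact code from the answer: schedule is replaced by a plain polling loop, and the config entries are hypothetical, but the start / signal / join / respawn cycle is the one described:

```python
import multiprocessing
import time


def sched(name, event):
    # Stand-in for the schedule loop: poll until asked to exit.
    while not event.is_set():
        time.sleep(0.01)


def spawn(names):
    # One Event per process so each worker can be signalled independently.
    procs = []
    for name in names:
        event = multiprocessing.Event()
        p = multiprocessing.Process(target=sched, args=(name, event))
        p.start()
        procs.append((p, event))
    return procs


def shutdown(procs):
    for _, event in procs:
        event.set()          # ask every worker to finish its current loop
    for p, _ in procs:
        p.join()             # then wait for each one to exit


if __name__ == '__main__':
    procs = spawn(['a', 'b'])   # hypothetical config entries
    time.sleep(0.1)
    shutdown(procs)             # graceful stop before a reload
    procs = spawn(['a', 'b'])   # respawn with the freshly read config
    shutdown(procs)
```

Setting every event before joining lets all workers wind down in parallel, instead of stopping them one at a time.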

This concludes the article on Python multiprocessing: gracefully killing a process. We hope the recommended answer above is helpful.
