
Terminating a multiprocessing process from main

I am using Redis Pubsub to trigger a child process and save a reference to it. I would like to terminate the previous process when the next message comes in. Unfortunately, although I can see the child process in the debugger, and it has a terminate() function, main does not seem to be able to see it; I get an error saying 'NoneType' object has no attribute 'terminate'. Is there a straightforward way to terminate the process?

My code (in '__main__'):

conn = redis.Redis(host="localhost", port="6379")
if not conn.ping():
    raise Exception('Redis unavailable')

pubsub = conn.pubsub()
pubsub.subscribe("feed")
data = None
loaderProcess = None

for message in pubsub.listen():
    logging.info("received pubsub message")
    logging.info(message)
    logging.info(message['type'])
    if message['type'] == "message":
        data = json.loads(message.get("data"))
        if data and data['source']:
            try:
                if loaderProcess is not None:
                    loaderProcess.terminate()
                    loaderProcess.join()
                args.infile = data['source']
                loader = Video(infile=data.get("source"), fps=30.0)
                loaderProcess = multiprocessing.Process(target=load, args=(loader, conn, args,))
            except Exception as e:
                logging.error("Error occurred", exc_info=True)

Stack trace:

ERROR:root:Error occurred
Traceback (most recent call last):
  File "C:\video-analysis\capture.py", line 140, in <module>
    loaderProcess.terminate()
  File "C:\Users\bkogan\AppData\Local\Programs\Python\Python39\lib\multiprocessing\process.py", line 133, in terminate
    self._popen.terminate()
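
For reference, the same error can be reproduced by calling terminate() on a Process that has never been started: terminate() goes through Process._popen, which is only set by start(), so it dereferences None. A minimal standalone sketch (the work function here is just a placeholder):

import multiprocessing

def work():
    pass

if __name__ == "__main__":
    p = multiprocessing.Process(target=work)
    # start() was never called, so p._popen is still None
    p.terminate()  # AttributeError: 'NoneType' object has no attribute 'terminate'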

OK, got it. A few issues. One was that I was not starting the process explicitly. Another was that my reference to the process was not working. Adding the process to a list works, though:

procs = []

for message in pubsub.listen():
    try:
        if message['type'] == "message":
            data = json.loads(message.get("data"))
            if data and data['source']:
                # Stop any previously started loader processes before launching a new one
                for proc in procs:
                    if proc.is_alive():
                        proc.terminate()
                        proc.join()
                procs.clear()
                loaderProcess = multiprocessing.Process(target=load, args=(data.get("source"), args))
                procs.append(loaderProcess)
                loaderProcess.start()
                continue
    except Exception:
        logging.error("Error occurred", exc_info=True)
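
For comparison, a minimal sketch of the same fix with a single reference instead of a list, assuming pubsub, load, args, and the imports are set up as in the question; the essential part is calling start() before ever calling terminate():

loaderProcess = None

for message in pubsub.listen():
    try:
        if message['type'] != "message":
            continue
        data = json.loads(message.get("data"))
        if not (data and data.get('source')):
            continue
        # Stop the previous loader, if any, before starting the next one
        if loaderProcess is not None and loaderProcess.is_alive():
            loaderProcess.terminate()
            loaderProcess.join()
        loaderProcess = multiprocessing.Process(target=load, args=(data.get("source"), args))
        loaderProcess.start()  # without start(), _popen stays None and terminate() fails
    except Exception:
        logging.error("Error occurred", exc_info=True)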
