
raspberry pi 3 multiprocessing queue synchronization between 2 processes

I have written a simple program using the multiprocessing library to run an extra process apart from the main code (2 processes in total). I ran this code on W7 Professional x64 through Anaconda-spyder v3.2.4 and it works almost as I want, except that while it runs, the memory consumption of my second process (not the main one) keeps growing until it reaches total capacity and the computer gets stuck and freezes (you can see this in the Windows Task Manager).

"""
Example to print data from a function using multiprocessing library
Created on Thu Jan 30 12:07:49 2018
author: Kevin Machado Gamboa
Contact: ing.kevin@hotmail.com
"""
from time import time
import numpy as np
from multiprocessing import Process, Queue, Event

t0=time()

def ppg_parameters(hr, minR, ampR, minIR, ampIR, t):
    HR = float(hr)
    f= HR * (1/60)
    # Spo2 Red signal function
    sR = minR + ampR * (0.05*np.sin(2*np.pi*t*3*f)
                       + 0.4*np.sin(2*np.pi*t*f) + 0.25*np.sin(2*np.pi*t*2*f+45))
    # Spo2 InfraRed signal function
    sIR = minIR + ampIR * (0.05*np.sin(2*np.pi*t*3*f)
                          + 0.4*np.sin(2*np.pi*t*f) + 0.25*np.sin(2*np.pi*t*2*f+45))
    return sR, sIR

def loop(q):
    """
    generates the values of the function ppg_parameters
    """
    hr =  60
    ampR = 1.0814       # amplitude for Red signal
    minR = 0.0          # displacement from zero for Red signal
    ampIR = 1.12        # amplitude for InfraRed signal
    minIR = 0.7         # displacement from zero for InfraRed signal
    # infinite loop to generate the signal
    while True:
        t = time()-t0
        y = ppg_parameters(hr, minR, ampR, minIR, ampIR, t)
        q.put([t, y[0], y[1]])

if __name__ == "__main__":
    _exit = Event()
    q = Queue()
    p = Process(target=loop, args=(q,))
    p.start()
    # starts the main process
    while q.qsize() != 1:
        try:
            data = q.get(True,2) # takes each data from the queue
            print(data[0], data[1], data[2])
        except KeyboardInterrupt:
            p.terminate()
            p.join()
            print('supposed to stop')
            break

Why is this happening? Perhaps it is the while loop in my 2nd process? I don't know. I haven't seen this issue anywhere.

Moreover, if I run the same code on my Rpi 3 model B, at some point it raises an error saying "the queue is empty", as if the main process were running faster than process two.

Any guess as to why this is happening, or any suggestion or link, would be helpful.

Thanks

It looks like inside your infinite loop you are adding to the queue, and I'm guessing that you are adding data faster than it can be taken off the queue by the other process.

You could check the queue size periodically from inside the infinite loop, and if it is over a certain amount (say 500 items), you could sleep for a few seconds and then check again.

https://docs.python.org/2/library/queue.html#Queue.Queue.qsize
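As for the "queue is empty" error seen on the Rpi: that is most likely the `queue.Empty` exception, which `q.get(True, 2)` raises when its two-second timeout expires before any item arrives (in Python 3 the exception lives in the `queue` module). A small sketch of catching it:

```python
import queue
from multiprocessing import Queue

q = Queue()
q.put("sample")
print(q.get(True, 2))  # retrieves "sample" well within the 2 s timeout

try:
    # The queue is now drained; get() waits up to 0.5 s and then raises
    # queue.Empty -- the "queue is empty" error seen on the Rpi.
    q.get(True, 0.5)
    timed_out = False
except queue.Empty:
    timed_out = True
print(timed_out)
```

In the original code, catching `queue.Empty` alongside `KeyboardInterrupt` in the consumer loop would let the main process decide whether to keep waiting or shut the worker down cleanly.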
