Make Python multiprocessing.Pipe between two functions

I use a OneWire sensor (ds18b20) to read out a temperature and use it in a PI algorithm to control an SSR relay. I want to use a Pipe between the two functions to send the temperature, so that the "reg" function can run as fast as possible. If I don't use a Pipe, the reg function waits for the temperature function (which takes 0.75 seconds) and the output is wrong... Can anyone please show me how to use the Pipe function?

The code:

import time
import RPi.GPIO as GPIO
import os

GPIO.setwarnings(False)
GPIO.setmode(GPIO.BCM)
GPIO.setup(22, GPIO.OUT)


def temperatur(self):
    while True:
        tfile = open("/sys/bus/w1/devices/28-00000433f810/w1_slave")
        text = tfile.read()
        tfile.close()
        secondline = text.split("\n")[1]
        temperaturedata = secondline.split(" ")[9]
        temp2 = float(temperaturedata[2:])
        self.temp = temp2 / 1000
        print self.temp


def reg(self):
    while True:
        ek = self.ref - self.temp
        P_del = self.Kp * ek
        I_del = ((self.Kp * self.Syklustid) / self.Ti) * ek
        Paadrag = P_del + I_del
        if Paadrag > 100:
            Paadrag = 100
        if Paadrag < 0:
            Paadrag = 0
        print "Paadrag: ", Paadrag, "  Temperatur: ", self.temp
        duty = Paadrag / 100.0
        on_time = self.Syklustid * duty
        off_time = self.Syklustid * (1.0 - duty)
        print "On_time: ", on_time, "  Off_time: ", off_time
        GPIO.output(22, GPIO.HIGH)
        time.sleep(on_time)
        GPIO.output(22, GPIO.LOW)
        time.sleep(off_time)

if __name__ == '__main__':

This is straight from the python documentation: http://docs.python.org/2/library/multiprocessing.html

from multiprocessing import Process, Pipe

def f(conn):
    conn.send([42, None, 'hello'])
    conn.close()

if __name__ == '__main__':
    parent_conn, child_conn = Pipe()
    p = Process(target=f, args=(child_conn,))
    p.start()
    print parent_conn.recv()   # prints "[42, None, 'hello']"
    p.join()
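Applied to the question, a minimal sketch of the same pattern (the sensor read is stubbed out with a fixed list of readings, and names like `temperatur`/`reg` are kept from the question; shown with Python 3 print syntax): the temperature process sends each reading through the pipe, and `reg` uses `Connection.poll()` so it only picks up a new value when one is available instead of blocking for the 0.75 s sensor read.

```python
from multiprocessing import Process, Pipe
import time

def temperatur(conn):
    # Stub standing in for the ds18b20 read (the real read takes ~0.75 s)
    for reading in [21.5, 21.7, 21.9]:
        time.sleep(0.05)
        conn.send(reading)
    conn.close()

def reg(conn):
    temp = None
    while True:
        # Drain any waiting readings without blocking; keep the latest
        while conn.poll():
            try:
                temp = conn.recv()
            except EOFError:      # sender closed its end: stop the loop
                return temp
        if temp is not None:
            pass                  # the PI calculation and GPIO switching go here
        time.sleep(0.01)

if __name__ == '__main__':
    parent_conn, child_conn = Pipe()
    p = Process(target=temperatur, args=(child_conn,))
    p.start()
    child_conn.close()            # close the parent's copy so EOF is delivered
    last = reg(parent_conn)
    p.join()
    print(last)                   # prints 21.9, the most recent reading seen
```

Closing the parent's copy of `child_conn` matters: otherwise the pipe never reports EOF and `reg` would spin forever after the sender exits.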

I've had better results using shared state, especially for simple data like temperature (a number, I assume, not a complex custom object). Here is a wee example (again, you will find more in the python docs):

#import stuff
from multiprocessing import Process, Manager

# Create a shared dictionary of parameters for both processes to use
manager = Manager()
global_props = manager.dict()
# SuperImportant - initialise all parameters first!!
global_props.update({'temp': 21.3})

def functionOne(global_props):
    # Do some stuff read temperature
    global_props['temp'] = newVal

def functionTwo(global_props):
    temp = global_props['temp']
    # Do some stuff with the new value

# assign listeners to the two processes, passing in the properties dictionary

handlerOne = functionOne  # This can also be a class in which case it calls __init__()
handlerTwo = functionTwo
processOne = Process(target=handlerOne,
    args=(global_props,))  # note the trailing comma: args must be a tuple
processTwo = Process(target=handlerTwo,
    args=(global_props,))

# Start the processes running...
processOne.start()
processTwo.start()
processOne.join() 
processTwo.join()
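Since the shared state here is a single float, a lighter alternative (my suggestion, not from the answer above) is `multiprocessing.Value`, which stores the number in shared memory and avoids the `Manager` proxy process entirely. A minimal sketch with the sensor read again stubbed out, in Python 3 syntax:

```python
from multiprocessing import Process, Value
import time

def temperatur(shared_temp):
    # Stub for the sensor loop; the real code would re-read w1_slave here
    for reading in [21.5, 21.7, 21.9]:
        time.sleep(0.05)
        shared_temp.value = reading   # update the shared double

def reg(shared_temp):
    # The control loop reads the latest value whenever it needs it,
    # instead of waiting 0.75 s for a fresh sensor read
    for _ in range(5):
        time.sleep(0.05)
        latest = shared_temp.value    # the PI calculation would use this

if __name__ == '__main__':
    temp = Value('d', 21.3)           # 'd' = double; initialise it first
    p = Process(target=temperatur, args=(temp,))
    p.start()
    reg(temp)
    p.join()
    print(temp.value)                 # prints 21.9
```

As with the `Manager` dictionary, initialise the value before starting the processes so the first read in `reg` never sees an unset parameter.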
