Problems with serial communication and queues
I have some problems creating a multi-process serial logger.

The plan: have a separate process read from the serial port and put the data into a queue. After some time, the main process reads the entire queue and processes the data.

But I'm not sure this is the right way to do it, because sometimes the data is not in the right order. It works well for slow communication.

Do I have to lock something?! Is there a smarter way to do this?
import time
import serial
from multiprocessing import Process, Queue

def myProcess(q):
    with serial.Serial("COM2", 115200, 8, "E", 1, timeout=None) as ser:
        while True:
            q.put("%02X" % ser.read(1)[0])

if __name__ == '__main__':
    try:
        q = Queue()
        p = Process(target=myProcess, args=(q,))
        p.daemon = True
        p.start()

        data = []
        while True:
            print(q.qsize())          #!debug
            while not q.empty():      #get all data from queue
                data.append(q.get())
            #proc_data(data)          #data processing
            time.sleep(1)             #emulate data processing
            del data[:]               #clear buffer
    except KeyboardInterrupt:
        print("clean-up")             #!debug
        p.join()
Update: I tried another version based on threads (see code below), but with the same effect/problem. The carry-over works fine, but one byte 'between' the carry-over and the new data is always gone -> the script seems to miss that byte while main reads the queue?!
import time, serial, threading, queue

def read_port(q):
    with serial.Serial("COM2", 19200, 8, "E", 1, timeout=None) as ser:
        while t.is_alive():
            q.put("%02X" % ser.read(1)[0])

def proc_data(data):
    #processing data here
    carry = data[len(data)//2:]   #DEBUG: emulate result (return last half of data)
    return carry

if __name__ == '__main__':
    try:
        q = queue.Queue()
        t = threading.Thread(target=read_port, args=(q,))
        t.daemon = True
        t.start()

        data = []
        while True:
            try:
                while True:
                    data.append(q.get_nowait())  #get all data from queue
            except queue.Empty:
                pass
            print(data)               #DEBUG: show carry-over + new data
            data = proc_data(data)    #process data and store carry-over
            print(data)               #DEBUG: show new carry-over
            time.sleep(1)             #DEBUG: emulate processing time
    except KeyboardInterrupt:
        print("clean-up")
        t.join(0)
Consider the following code.

1) The two processes are siblings; the parent just sets them up, then waits for Ctrl-C to interrupt everything.

2) One proc puts raw bytes on the shared queue.

3) The other proc blocks for the first byte of data. When it gets the first byte, it grabs the rest of the data, outputs it in hex, then continues.

4) The parent proc just sets up the others, then waits for an interrupt using signal.pause().
Note that with multiprocessing, the qsize() (and probably empty()) functions are unreliable -- thus the above code will reliably grab your data.
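The block-then-drain pattern can be seen in isolation with an ordinary producer process standing in for the serial port. This is a minimal sketch (the names producer and drain are illustrative, not from the original): block on get() for at least one item, then pull the rest with get_nowait() until the queue reports Empty. No item is ever dropped, because every item is removed by an actual get, never skipped based on a size check.

```python
from multiprocessing import Process, Queue
from queue import Empty  # multiprocessing.Queue raises queue.Empty on get_nowait()

def producer(q):
    # stand-in for the serial reader: put 10 "bytes" on the queue
    for b in range(10):
        q.put(b)

def drain(q, n_expected):
    received = []
    while len(received) < n_expected:
        received.append(q.get())            # block for at least one item
        try:
            while True:                     # then grab whatever else is queued
                received.append(q.get_nowait())
        except Empty:
            pass
    return received

if __name__ == '__main__':
    q = Queue()
    p = Process(target=producer, args=(q,))
    p.start()
    data = drain(q, 10)
    p.join()
    print(data)   # all 10 items arrive, in order
```

Because a single process consumes the queue, FIFO order is preserved end to end; n_expected is only needed here to make the demo terminate, whereas the logger simply loops forever.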
import signal, time
import serial
from multiprocessing import Process, Queue
from queue import Empty  # multiprocessing.Queue raises queue.Empty

def read_port(q):
    with serial.Serial("COM2", 115200, 8, "E", 1, timeout=None) as ser:
        while True:
            q.put(ser.read(1)[0])   # one raw byte (an int in Python 3)

def show_data(q):
    while True:
        # block for first byte of data
        data = [q.get()]
        # consume more data if available
        try:
            while True:
                data.append(q.get_nowait())
        except Empty:
            pass
        print('got:', ":".join("{:02x}".format(b) for b in data))

if __name__ == '__main__':
    try:
        q = Queue()
        Process(target=read_port, args=(q,)).start()
        Process(target=show_data, args=(q,)).start()
        signal.pause()  # wait for interrupt
    except KeyboardInterrupt:
        print("clean-up")  #!debug
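One caveat: signal.pause() exists only on POSIX, while "COM2" suggests Windows. A portable alternative (a sketch, not part of the original answer) is to have the parent simply join a child, which blocks until that child exits or Ctrl-C arrives:

```python
import time
from multiprocessing import Process

def worker():
    # stand-in for the reader/printer loops, which would normally run forever
    time.sleep(0.1)

if __name__ == '__main__':
    p = Process(target=worker)
    p.start()
    try:
        p.join()           # blocks until the child exits or KeyboardInterrupt
    except KeyboardInterrupt:
        p.terminate()      # clean up the child on interrupt
        p.join()
```

In the answer's setup you would keep references to both Process objects and join one of them instead of calling signal.pause().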