
How to share OpenCV images in two python programs?

I have three Python files: glob_var.py, read_cam.py, read_globVar.py. Their contents are as follows.

glob_var.py:

globVar = {}
def set(name, val):
    globVar[name] = val

def get(name):
    val = globVar.get(name, None)
    return val

read_cam.py

import cv2
import glob_var

if __name__ == '__main__':
    cam = cv2.VideoCapture(0)
    key = 0
    while key != 27:
        ret, img = cam.read()
        cv2.imshow('img', img)

        key = cv2.waitKey(1) & 0xFF
        glob_var.set('image', img)

read_globVar.py

import glob_var
import cv2
from time import sleep

if __name__ == '__main__':
    key = 0
    while key != 27:
        img = glob_var.get('image')
        if img is None:
            print(f"no image in globVar")
            sleep(1)
            continue

        print(f"read image with shape {img.shape}")
        cv2.imshow('image', img)
        key = cv2.waitKey(1) & 0xFF

From those three Python files, I think you can see what I want to do. Yes, I want read_cam.py to read images from the camera and broadcast them through a global variable, so that read_globVar.py can get each image and show it. I run read_cam.py in one terminal and read_globVar.py in another, but I could not make it work properly. Is what I am thinking possible? How can I manage it? Thanks a lot!

=====update1: Pub and Sub in Python=====
I have used the ROS (Robot Operating System) system for a while. It provides publish and subscribe functions to exchange variables between different programs, also called nodes. So my question is: is there any Python package that provides such functionality? Redis provides this, but is it the fastest or best way?

You could use Redis to do this. It is a very fast, in-memory data structure server that can store strings, integers, hashes, lists, queues, sets, sorted sets, and binary blobs such as images. It is free and simple to install on macOS, Linux and Windows.

Also, you can read or write Redis values with bash, Python, PHP, C/C++ or many other languages. Furthermore, you can read or write to or from a server across the network or across the world; just change the IP address in the initial connection. So, effectively, you could acquire images in Python on your Raspberry Pi under Linux, store them in Redis, and process them on your PC under Windows in C/C++.

Then you just put your images into Redis, named as Camera1 or Entrance, or put them in a sorted set so you can buffer images by frame number. You can also give images (or other data structures) a "Time-To-Live" so that your RAM doesn't fill up.

Here are the bones of your code roughly rewritten to use Redis. There is no serious error checking or flexibility built in for the moment, but it all runs fine.

Here is read_cam.py:

#!/usr/bin/env python3

import cv2
import struct
import redis
import numpy as np

def toRedis(r, a, n):
    """Store given Numpy array 'a' in Redis under key 'n'"""
    h, w = a.shape[:2]
    # Prepend height and width as two big-endian unsigned ints
    shape = struct.pack('>II', h, w)
    encoded = shape + a.tobytes()

    # Store encoded data in Redis
    r.set(n, encoded)

if __name__ == '__main__':

    # Redis connection
    r = redis.Redis(host='localhost', port=6379, db=0)

    cam = cv2.VideoCapture(0)
    key = 0
    while key != 27:
        ret, img = cam.read()
        if not ret:
            # Camera returned no frame, e.g. it was unplugged
            break
        cv2.imshow('img', img)

        key = cv2.waitKey(1) & 0xFF
        toRedis(r, img, 'image')

And here is read_globvar.py:

#!/usr/bin/env python3

import cv2
from time import sleep
import struct
import redis
import numpy as np

def fromRedis(r, n):
    """Retrieve Numpy array from Redis key 'n'"""
    encoded = r.get(n)
    if encoded is None:
        # Key not set yet, i.e. read_cam.py has not stored a frame
        return None
    h, w = struct.unpack('>II', encoded[:8])
    a = np.frombuffer(encoded, dtype=np.uint8, offset=8).reshape(h, w, 3)
    return a

if __name__ == '__main__':
    # Redis connection
    r = redis.Redis(host='localhost', port=6379, db=0)

    key = 0
    while key != 27:
        img = fromRedis(r, 'image')
        if img is None:
            # Nothing stored yet; wait for the writer to publish a frame
            sleep(1)
            continue

        print(f"read image with shape {img.shape}")
        cv2.imshow('image', img)
        key = cv2.waitKey(1) & 0xFF

Note that you could equally store the image height and width as JSON in Redis instead of the struct.pack and struct.unpack stuff I did.
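For instance, a minimal sketch of that JSON variant (the function names here are made up for illustration; the Redis `r.set`/`r.get` calls stay exactly the same as above):

```python
import json
import numpy as np

def encode_with_json_header(a):
    """Pack a uint8 BGR image as: 4-byte header length, JSON header, raw pixels."""
    header = json.dumps({'h': a.shape[0], 'w': a.shape[1]}).encode('utf-8')
    return len(header).to_bytes(4, 'big') + header + a.tobytes()

def decode_with_json_header(encoded):
    """Inverse of encode_with_json_header: read the header, then view the pixels."""
    hlen = int.from_bytes(encoded[:4], 'big')
    meta = json.loads(encoded[4:4 + hlen].decode('utf-8'))
    return np.frombuffer(encoded, dtype=np.uint8,
                         offset=4 + hlen).reshape(meta['h'], meta['w'], 3)

# Writer: r.set('image', encode_with_json_header(img))
# Reader: img = decode_with_json_header(r.get('image'))
```

The JSON header is a little larger than the 8-byte struct, but it is self-describing, so you can later add fields (dtype, channels, frame number) without breaking readers.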

Note too that you could encode your image as a JPEG in memory and store the JPEG in Redis (instead of a Numpy array), and that might save memory and network bandwidth.

Either way, the concept of using Redis is the same.

You can use a shared array from Python's multiprocessing module to quickly share large volumes of data between processes. I don't have any complete, tested code for you like in the Redis answer above, but I have enough to hopefully get you started.

So you would use:

import numpy as np
from multiprocessing import Process, Queue
from multiprocessing.sharedctypes import Array
from ctypes import c_uint8

Then in your main, you would declare a large Array, probably big enough for, say, 2-4 of your large images:

bufShape = (1080, 1920, 3)  # 1080p

and

# Create zeroed out shared array
buffer = Array(c_uint8, bufShape[0] * bufShape[1] * bufShape[2])
# Make into numpy array
buf_arr = np.frombuffer(buffer.get_obj(), dtype=c_uint8)
buf_arr.shape = bufShape

# Create a list of workers
workers = [Worker(1, buffer, str(i)) for i in range(2)]

# Start the workers
for worker in workers:
    worker.start()

Then you would derive your workers from the Process class like this:

class Worker(Process):
    def __init__(self, q_size, buffer, name=''):
        super().__init__()
        self.queue = Queue(q_size)
        self.buffer = buffer
        self.name = name

    def run(self):
        buf_arr = np.frombuffer(self.buffer.get_obj(), dtype=c_uint8)
        buf_arr.shape = bufShape
        while True:
            item = self.queue.get()
            ...

You can see at the start of run() that the worker just makes a Numpy array from the big shared buffer, so the worker is reading what the main program is writing, but hopefully you synchronise them so that while main is writing frames 2-4, a worker is reading frame 1.

Then, hopefully, you can see that the main program can tell a worker that there is a frame of data by writing a simple frame index into the worker's queue (rather than sending the whole frame itself), using:

worker.queue.put(i)
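Putting those pieces together, here is a minimal runnable sketch of the same idea, shrunk to a tiny buffer and a plain worker function instead of the Worker class above (the names demo and BUF_SHAPE are mine, purely for illustration):

```python
import numpy as np
from multiprocessing import Process, Queue
from multiprocessing.sharedctypes import Array
from ctypes import c_uint8

BUF_SHAPE = (4, 4, 3)  # tiny "frame" for the demo; use (1080, 1920, 3) for 1080p

def worker(buffer, queue, result):
    """Wait for a frame index on the queue, then read the frame from shared memory."""
    arr = np.frombuffer(buffer.get_obj(), dtype=np.uint8).reshape(BUF_SHAPE)
    idx = queue.get()                   # main signals that a frame is ready
    result.put((idx, int(arr.sum())))  # stand-in for real image processing

def demo():
    # Shared zeroed-out buffer, visible to both processes
    buffer = Array(c_uint8, BUF_SHAPE[0] * BUF_SHAPE[1] * BUF_SHAPE[2])
    queue, result = Queue(1), Queue(1)
    p = Process(target=worker, args=(buffer, queue, result))
    p.start()

    # Main writes a frame into shared memory, then sends only the index
    arr = np.frombuffer(buffer.get_obj(), dtype=np.uint8).reshape(BUF_SHAPE)
    arr[:] = 1
    queue.put(0)

    idx, total = result.get()
    p.join()
    return idx, total
```

The key point is that only the small integer index crosses the queue; the pixel data itself never gets pickled or copied between processes.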

I have written an example of how to share images using a memory-mapped file here: https://github.com/off99555/python-mmap-ipc

It's a feature that's already available in most languages. The basic idea is that we write the image to a virtual file and then read it in another process. The latency is around 3-4 ms, which is minimal compared to the latency inherent in the camera itself. This approach is faster than network protocols like TCP/IP, HTTP, etc. I've already tested with gRPC and ZeroMQ; they are all slower than the memory-mapped file approach.
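The approach boils down to something like the following sketch: a single pre-sized file that both sides map with Python's mmap module. The helper names and the tiny frame shape here are my own; in a real setup, the writer and reader functions run in separate processes, and you would add some signalling so the reader knows when a fresh frame has landed.

```python
import mmap
import numpy as np

SHAPE = (4, 4, 3)  # tiny demo frame; a real camera frame would be e.g. (1080, 1920, 3)
NBYTES = SHAPE[0] * SHAPE[1] * SHAPE[2]

def write_frame(path, img):
    """Writer side: copy the frame's bytes into the memory-mapped file."""
    with open(path, 'r+b') as f:
        with mmap.mmap(f.fileno(), NBYTES) as mm:
            mm[:NBYTES] = img.tobytes()

def read_frame(path):
    """Reader side: map the same file and view its contents as a numpy array."""
    with open(path, 'r+b') as f:
        with mmap.mmap(f.fileno(), NBYTES) as mm:
            return np.frombuffer(mm[:NBYTES], dtype=np.uint8).reshape(SHAPE)

# The file must exist and already be NBYTES long before mapping, e.g.:
# with open(path, 'wb') as f: f.write(b'\x00' * NBYTES)
```

Because both processes map the same pages of the OS page cache, the "file" transfer is effectively a shared-memory copy, which is where the low latency comes from.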
