
How to send and receive a webcam stream using TCP sockets in Python?

I am trying to recreate this project. What I have is a server (my computer) and a client (my Raspberry Pi). What I am doing differently than the original project is that I am trying to use a simple webcam instead of a Raspberry Pi camera to stream images from my RPi to the server. I know that I must:

  1. Get OpenCV image frames from the camera.
  2. Convert a frame (which is a numpy array) to bytes.
  3. Transfer the bytes from the client to the server.
  4. Convert the bytes back into frames and view.

Examples would be appreciated.

self_driver.py

import SocketServer
import threading
import numpy as np
import cv2
import sys


ultrasonic_data = None

#BaseRequestHandler is used to process incoming requests
class UltrasonicHandler(SocketServer.BaseRequestHandler):

    data = " "

    def handle(self):

        while self.data:
            self.data = self.request.recv(1024)
            ultrasonic_data = float(self.data.split('.')[0])
            print(ultrasonic_data)


#VideoStreamHandler uses streams which are file-like objects for communication
class VideoStreamHandler(SocketServer.StreamRequestHandler):

    def handle(self):
        stream_bytes = b''

        try:
            stream_bytes += self.rfile.read(1024)
            image = np.frombuffer(stream_bytes, dtype="B")
            print(image.shape)
            cv2.imshow('F', image)
            cv2.waitKey(0)

        finally:
            cv2.destroyAllWindows()
            sys.exit()


class Self_Driver_Server:

    def __init__(self, host, portUS, portCam):
        self.host = host
        self.portUS = portUS
        self.portCam = portCam

    def startUltrasonicServer(self):
        # Create the Ultrasonic server, binding to localhost on port 50001
        server = SocketServer.TCPServer((self.host, self.portUS), UltrasonicHandler)
        server.serve_forever()

    def startVideoServer(self):
        # Create the video server, binding to localhost on port 50002
        server = SocketServer.TCPServer((self.host, self.portCam), VideoStreamHandler)
        server.serve_forever()

    def start(self):
        ultrasonic_thread = threading.Thread(target=self.startUltrasonicServer)
        ultrasonic_thread.daemon = True
        ultrasonic_thread.start()
        self.startVideoServer()


if __name__ == "__main__":

    #From SocketServer documentation
    HOST, PORTUS, PORTCAM = '192.168.0.18', 50001, 50002
    sdc = Self_Driver_Server(HOST, PORTUS, PORTCAM)

    sdc.start()

video_client.py

import socket
import time
import cv2


client_sock = socket.socket()
client_sock.connect(('192.168.0.18', 50002))
#We are going to 'write' to a file in 'binary' mode
conn = client_sock.makefile('wb')

try:
    cap = cv2.VideoCapture(0)
    cap.set(cv2.cv.CV_CAP_PROP_FRAME_WIDTH,320)
    cap.set(cv2.cv.CV_CAP_PROP_FRAME_HEIGHT,240)

    start = time.time()

    while(cap.isOpened()):
        conn.flush()
        ret, frame = cap.read()
        byteImage = frame.tobytes()
        conn.write(byteImage)


finally:
    finish = time.time()
    cap.release()
    client_sock.close()
    conn.close()

You can't just display every received buffer of 1-1024 bytes as an image; you have to concatenate them up and only display an image when your buffer is complete.

If you know, out of band, that your images are going to be a fixed number of bytes, you can do something like this:

IMAGE_SIZE = 320*240*3

def handle(self):
    stream_bytes = b''

    try:
        # Keep reading chunks; peel off complete frames as they accumulate
        while True:
            chunk = self.rfile.read(1024)
            if not chunk:
                break
            stream_bytes += chunk
            while len(stream_bytes) >= IMAGE_SIZE:
                image = np.frombuffer(stream_bytes[:IMAGE_SIZE], dtype="B")
                stream_bytes = stream_bytes[IMAGE_SIZE:]
                print(image.shape)
                cv2.imshow('F', image)
                cv2.waitKey(0)
    finally:
        cv2.destroyAllWindows()
        sys.exit()

If you don't know that, you have to add some kind of framing protocol, like sending the frame size as a uint32 before each frame, so the server can know how many bytes to receive for each frame.
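A minimal sketch of such a length-prefix protocol might look like this (the `send_frame`/`recv_frame` names and the big-endian uint32 choice are just illustrative, not from the project):

```python
import struct

def send_frame(sock, frame_bytes):
    # Prefix each frame with its length as a 4-byte big-endian uint32
    sock.sendall(struct.pack('>I', len(frame_bytes)) + frame_bytes)

def recv_exactly(rfile, n):
    # A read(n) on a socket stream may return fewer than n bytes,
    # so loop until we have exactly n
    data = b''
    while len(data) < n:
        chunk = rfile.read(n - len(data))
        if not chunk:
            raise EOFError('connection closed mid-frame')
        data += chunk
    return data

def recv_frame(rfile):
    (length,) = struct.unpack('>I', recv_exactly(rfile, 4))
    return recv_exactly(rfile, length)
```

On the client you'd call `send_frame(client_sock, frame.tobytes())` for each capture; in the server's `handle` you'd loop over `recv_frame(self.rfile)`.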


Next, if you're just sending the raw bytes, without any dtype or shape or order information, you need to embed the dtype and shape information into the server. If you know it's supposed to be, say, bytes in C order in a particular shape, you can do that manually:

image = np.frombuffer(stream_bytes, dtype="B").reshape(240, 320, 3)

… but if not, you have to send that information as part of your framing protocol as well.

Alternatively, you could send a pickle.dumps of the buffer and pickle.loads it on the other side, or np.save to a BytesIO and np.load the result. Either way, that includes the dtype, shape, order, and stride information as well as the raw bytes, so you don't have to worry about it.
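For instance, a rough sketch of the np.save/BytesIO round trip (you'd still need some framing on the wire, since the serialized size varies per frame; the helper names here are made up for illustration):

```python
import io
import numpy as np

def frame_to_bytes(frame):
    # np.save writes the .npy header (dtype, shape, order) plus the raw data
    buf = io.BytesIO()
    np.save(buf, frame)
    return buf.getvalue()

def bytes_to_frame(data):
    # np.load reads the header back, so no out-of-band metadata is needed
    return np.load(io.BytesIO(data))
```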


The next problem is that you're exiting as soon as you display one image. Is that really what you want? If not… just don't do that.


But that just raises another problem. Do you really want to block the whole server with that cv2.waitKey? Your client is capturing images and sending them as fast as it can; surely you either want to make the server display them as soon as they arrive, or change the design so the client only sends frames on demand. Otherwise, you're just going to get a bunch of near-identical frames, then a many-seconds-long gap while the client is blocked waiting for you to drain the buffer, then repeat.
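One sketch of the "display as soon as they arrive" option: pull frames out of the stream with a helper that drops stale data, and use `cv2.waitKey(1)` instead of `cv2.waitKey(0)` so the display loop never blocks. The `frames_from_stream` helper below is hypothetical, assuming the fixed-size frames from the earlier example:

```python
IMAGE_SIZE = 320 * 240 * 3  # assuming the fixed frame size from above

def frames_from_stream(read_chunk):
    # read_chunk is any callable returning the next chunk of bytes
    # (b'' at EOF), e.g. lambda: self.rfile.read(4096) in the handler
    stream_bytes = b''
    while True:
        chunk = read_chunk()
        if not chunk:
            return
        stream_bytes += chunk
        # If the consumer has fallen behind, drop stale frames so we
        # always hand back the newest complete one
        while len(stream_bytes) >= 2 * IMAGE_SIZE:
            stream_bytes = stream_bytes[IMAGE_SIZE:]
        if len(stream_bytes) >= IMAGE_SIZE:
            yield stream_bytes[:IMAGE_SIZE]
            stream_bytes = stream_bytes[IMAGE_SIZE:]
```

Inside `handle`, each yielded frame would be reshaped, passed to `cv2.imshow`, and followed by `cv2.waitKey(1)` rather than `cv2.waitKey(0)`.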
