Send & receive video between Python and Rust via UNIX socket

The final requirement is to create a system which can stream video footage via a UNIX IPC socket from Python to Rust. The Python script has exclusive access to the camera/video. I'm pretty new to Rust.

I have tried an approach wherein the control flows as follows:

  • In Python, as the video flows in, strip each frame out of it to convert it into a NumPy array
  • Then, convert it into a string. Then to bytes.
  • Send it over a UNIX IPC socket.
  • At the receiving end, convert it back into a string.
  • Hopefully parse it to get a usable array and make an image from it.
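The steps above can be sketched in isolation to make the pitfall concrete (the `frame` array here is a hypothetical stand-in for one decoded video frame; this deliberately reproduces the string round-trip being attempted):

```python
import numpy as np

# print the whole array without truncating, as in the sender script
np.set_printoptions(threshold=np.inf)

frame = np.zeros((2, 2, 3), dtype=np.uint8)  # stand-in for one video frame

as_text = str(frame)               # array -> string
payload = bytes(as_text, 'utf-8')  # string -> bytes (what gets sent)

# at the receiving end: bytes -> string
decoded = payload.decode('utf-8')

# the remaining step, parsing `decoded` back into an array, is the hard
# part: str(frame) is a pretty-printed repr with brackets and whitespace,
# much larger than the frame's raw bytes, and has no simple inverse
```

Note that even for this tiny frame the textual payload is several times larger than the 12 raw pixel bytes (`frame.nbytes`), which is why the answer below recommends skipping the string stage entirely.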

Sender:

import socket, os, cv2, numpy
# print all the string without truncating
numpy.set_printoptions(threshold=numpy.inf)

# declare the camera socket and unlink path if already in use
camera_socket_path = '/home/user/exps/test.sock'
try:
    os.unlink(camera_socket_path)
except OSError:
    pass

# create and listen on the socket
camera_socket = socket.socket(socket.AF_UNIX, socket.SOCK_STREAM)
camera_socket.bind(camera_socket_path)
camera_socket.listen()

conn, addr = camera_socket.accept()

vidcap = cv2.VideoCapture('example.mp4')
success, image = vidcap.read()
print(str(image))
while success:
    conn.send(bytes(str(image), 'utf-8'))      
    success, image = vidcap.read()
    break # after sending a frame, for testing

conn.close()

Receiver:

use std::os::unix::net::UnixStream;
use std::io::prelude::*;

use image::RgbImage;
use ndarray::Array3;

fn array_to_image(arr: Array3<u8>) -> RgbImage {
    assert!(arr.is_standard_layout());

    let (height, width, _) = arr.dim();
    let raw = arr.into_raw_vec();

    RgbImage::from_raw(width as u32, height as u32, raw)
        .expect("container should have the right size for the image dimensions")
}

fn main() {
    // create a standard UNIX IPC socket stream
    let mut stream = UnixStream::connect("/home/user/exps/test.sock").unwrap();

    loop {
        // 2074139 is the length of the string printed at sender :D
        // I thought it might work out but it didn't obviously
        // the docs suggest an exponential of 2. default was 1024
        let mut buf = [0; 2074139];
        let count = stream.read(&mut buf).unwrap();
        let response = String::from_utf8(buf[..2074139].to_vec()).unwrap();
        // write a parser function to convert the string to Array3 iterating through line()
        // let pic = array_to_image(convert(response));
        println!("{}", response);
        break; // get a single frame, for testing
    }
}

Both the strings look nothing like each other. In the final version, I hope to create a stream with which I can continuously do this for an incoming video stream from a camera, with something like cv2.VideoCapture(). What is the list of things to be done to achieve it?

An mp4 file consists of arbitrary byte sequences, not UTF-8 strings. So you should send the raw bytes and receive into a Vec<u8> or similar data type.

There is no reason to mangle things through string encodings; UNIX sockets support arbitrary bytes.
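A minimal sketch of the raw-bytes approach on the sender side (the frame header layout, height/width/channels as three big-endian u32s, is an assumed convention introduced here, not part of the original code; frames returned by cv2.VideoCapture are already uint8 NumPy arrays, so they can be passed straight in):

```python
import socket
import struct

import numpy as np

def send_frame(conn: socket.socket, frame: np.ndarray) -> None:
    """Send one frame as a 12-byte header (height, width, channels as
    big-endian u32s) followed by the raw pixel bytes."""
    if not frame.flags['C_CONTIGUOUS']:
        frame = np.ascontiguousarray(frame)
    h, w, c = frame.shape
    # fixed-size header so the receiver knows exactly how many bytes follow
    conn.sendall(struct.pack('!III', h, w, c))
    # raw bytes, no string round-trip
    conn.sendall(frame.tobytes())
```

In the sender loop from the question, `conn.send(bytes(str(image), 'utf-8'))` would then become `send_frame(conn, image)`. The length-prefix header matters because a stream socket has no message boundaries: without it the receiver cannot tell where one frame ends and the next begins.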

For debugging purposes you can print out the bytes in hex format, but you shouldn't send them as such, since that too would be wasteful.
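On the Rust side, a matching sketch under the same assumed 12-byte header convention; `read_exact` keeps reading until the buffer is full, so no guessed buffer size like `2074139` is needed:

```rust
use std::convert::TryInto;
use std::io::{self, Read};

/// Read one frame framed as three big-endian u32s (height, width,
/// channels) followed by exactly height * width * channels raw bytes.
fn read_frame(reader: &mut impl Read) -> io::Result<(u32, u32, u32, Vec<u8>)> {
    // fixed-size header first
    let mut header = [0u8; 12];
    reader.read_exact(&mut header)?;
    let h = u32::from_be_bytes(header[0..4].try_into().unwrap());
    let w = u32::from_be_bytes(header[4..8].try_into().unwrap());
    let c = u32::from_be_bytes(header[8..12].try_into().unwrap());

    // then exactly the advertised number of pixel bytes
    let mut data = vec![0u8; (h as usize) * (w as usize) * (c as usize)];
    reader.read_exact(&mut data)?;
    Ok((h, w, c, data))
}
```

In `main`, the stream from `UnixStream::connect("/home/user/exps/test.sock")` implements `Read` and can be passed to `read_frame` directly; the returned `Vec<u8>` is the raw buffer that `Array3`/`RgbImage::from_raw` in the question expect, with no string parsing step at all.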
