
How to receive a byte stream using GStreamer with the Python subprocess module or the gst-launch-1.0 command?

I want to receive a byte stream by using GStreamer with the Python subprocess module. At the moment I can successfully pull the byte stream with FFmpeg, as shown below.

import cv2
import numpy as np
import subprocess as sp


height = 714
width = 420
rtsp_url = 'rtsp://127.0.0.1:8554/video'

# command
command = ['ffmpeg',
            '-i', rtsp_url,
            '-f', 'rawvideo',
            '-s', '{}x{}'.format(width, height),  # frame size as WIDTHxHEIGHT
            '-pix_fmt', 'bgr24',
            '-fflags', 'nobuffer',
            '-']

p = sp.Popen(command, stdout=sp.PIPE, bufsize=10**8)

while True:
    raw_image = p.stdout.read(width*height*3)         # read one raw BGR frame from the pipe
    image = np.frombuffer(raw_image, dtype=np.uint8)   # np.fromstring is deprecated; use frombuffer
    image = image.reshape((height, width, 3)).copy()
    cv2.imshow('image', image)
    key = cv2.waitKey(20)

I want to use a GStreamer command instead of FFmpeg. So far, I have managed to write the byte stream to a file with the GStreamer command line:

gst-launch-1.0 rtspsrc location=rtsp://127.0.0.1:8554/video latency=0 drop-on-latency=true ! rtph264depay ! video/x-h264, stream-format='byte-stream' ! filesink location=/home/name/stdout

However, this command does not send the output byte stream through a pipe, so nothing shows up on the terminal, unlike the FFmpeg command. How can I change this command so that the byte stream is written to a pipe that I can read from? Thank you for taking the time to answer!
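
To clarify, what I am hoping for on the Python side is the same kind of reading loop as with FFmpeg above, along these lines (an untested sketch; pointing filesink at /dev/stdout is only my guess at how the output could reach the pipe):

import subprocess as sp
import shlex

# Untested sketch: point filesink at /dev/stdout so the H.264 byte stream lands
# on the subprocess pipe instead of in a regular file. --quiet keeps GStreamer's
# own status messages off stdout.
cmd = 'gst-launch-1.0 --quiet rtspsrc location=rtsp://127.0.0.1:8554/video latency=0 drop-on-latency=true ! rtph264depay ! video/x-h264,stream-format=byte-stream ! filesink location=/dev/stdout'

p = sp.Popen(shlex.split(cmd), stdout=sp.PIPE)

while True:
    chunk = p.stdout.read(4096)   # still encoded H.264, read in arbitrary-sized chunks
    if not chunk:
        break
    # ... parse/decode the chunk here ...

p.wait()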

Here is the RTSP streaming code:

import cv2
import time
import subprocess as sp
import numpy as np


rtsp_url = 'rtsp://127.0.0.1:8554/video'
video_path = r'test.mp4'
cap = cv2.VideoCapture(video_path)

# Get video information
fps = int(cap.get(cv2.CAP_PROP_FPS))
width = int(cap.get(cv2.CAP_PROP_FRAME_WIDTH))
height = int(cap.get(cv2.CAP_PROP_FRAME_HEIGHT))
print('fps={}'.format(fps))

# command
command = ['ffmpeg',
            '-re',
            '-y',
            '-stream_loop', '-1',
            '-f', 'rawvideo',
            '-vcodec', 'rawvideo',
            '-pix_fmt', 'bgr24',
            '-s', "{}x{}".format(width, height),
            '-r', str(fps),
            '-i', '-',
            '-c:v', 'libx264',
            '-pix_fmt', 'yuv420p',
            '-preset', 'ultrafast',
            # '-flags2', 'local_header',
            '-bsf:v', 'dump_extra=freq=k',  # no inner quotes needed; args are not passed through a shell
            '-keyint_min', '60',
            '-g', '60',
            '-sc_threshold', '0', 
            '-f', 'rtsp',
            '-rtsp_transport', 'tcp',
            '-muxdelay', '0.1', 
            rtsp_url]

p = sp.Popen(command, stdin=sp.PIPE)

cnt = 0
t_start = time.time()
while (cap.isOpened()):
    t_cur = time.time()-t_start

    ret, frame = cap.read()
    if not ret:
        cnt += 1
        print("count: {}".format(cnt))
        cap = cv2.VideoCapture(video_path)
        continue

    p.stdin.write(frame.tobytes())

    cv2.imshow('real_time', frame)

    key = cv2.waitKey(20)
    if key == 27:
        p.terminate()
        break

I have managed to create an example that works on Linux.

I could not simulate an RTSP camera, so I used an MP4 file as input.

Creating an MP4 input file (for testing) with the FFmpeg CLI from Python:

sp.run(shlex.split(f'ffmpeg -y -f lavfi -i testsrc=size={width}x{height}:rate=25:duration=100 -vcodec libx264 -pix_fmt yuv420p {input_file_name}'))

The GStreamer command is:

p = sp.Popen(shlex.split(f'{gstreamer_exe} --quiet filesrc location={input_file_name} ! qtdemux ! video/x-h264 ! avdec_h264 ! videoconvert ! capsfilter caps="video/x-raw, format=BGR" ! filesink location={stdout_file_name}'), stdout=sp.PIPE)

  • --quiet is used because GStreamer prints its own messages to stdout.
  • filesrc location ... reads the MP4 input - replace it with an RTSP source (see the sketch after this list).
  • videoconvert and capsfilter caps="video/x-raw, format=BGR" convert the video to raw BGR format.
  • filesink location=/dev/stdout redirects the output to stdout (on Linux).
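
For reference, the same decode/convert chain with an RTSP front end instead of the MP4 file would look roughly like the following (a sketch only; rtsp_url below is a placeholder, and the tested RTSP variant is in the update at the end of this answer):

import subprocess as sp
import shlex

# Sketch: swap the filesrc/qtdemux front end for an RTSP source.
# rtsp_url is a placeholder; see the update below for the tested version.
rtsp_url = 'rtsp://127.0.0.1:8554/video'

p = sp.Popen(shlex.split(f'gst-launch-1.0 --quiet rtspsrc location={rtsp_url} ! rtph264depay ! avdec_h264 ! videoconvert ! capsfilter caps="video/x-raw, format=BGR" ! filesink location=/dev/stdout'), stdout=sp.PIPE)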

Code sample:

import cv2
import numpy as np
import subprocess as sp
import shlex
from sys import platform

width = 714
height = 420

input_file_name = 'input.mp4'  # For testing, use MP4 input file instead of RTSP input.

# Build MP4 synthetic input video file for testing:
sp.run(shlex.split(f'ffmpeg -y -f lavfi -i testsrc=size={width}x{height}:rate=25:duration=100 -vcodec libx264 -pix_fmt yuv420p {input_file_name}'))

if platform == "win32":
    # stdout_file_name = "con:"
    # gstreamer_exe = 'c:/gstreamer/1.0/msvc_x86_64/bin/gst-launch-1.0.exe'
    raise Exception('win32 system is not supported')
else:
    stdout_file_name = "/dev/stdout"
    gstreamer_exe = 'gst-launch-1.0'

# https://stackoverflow.com/questions/29794053/streaming-mp4-video-file-on-gstreamer
p = sp.Popen(shlex.split(f'{gstreamer_exe} --quiet filesrc location={input_file_name} ! qtdemux ! video/x-h264 ! avdec_h264 ! videoconvert ! capsfilter caps="video/x-raw, format=BGR" ! filesink location={stdout_file_name}'), stdout=sp.PIPE)

while True:
    raw_image = p.stdout.read(width * height * 3)

    if len(raw_image) < width*height*3:
        break

    image = np.frombuffer(raw_image, dtype='uint8').reshape((height, width, 3))
    cv2.imshow('image', image)
    key = cv2.waitKey(1)

p.stdout.close()
p.wait()
cv2.destroyAllWindows()

Update:

Following your new question, I have managed to create an RTSP capturing example:

import cv2
import numpy as np
import subprocess as sp
import shlex

width = 240
height = 160

rtsp_url = 'rtsp://wowzaec2demo.streamlock.net/vod/mp4:BigBuckBunny_115k.mp4'  # For testing, use public RTSP input.

gstreamer_exe = 'gst-launch-1.0'  # '/usr/bin/gst-launch-1.0'

# https://stackoverflow.com/questions/29794053/streaming-mp4-video-file-on-gstreamer
p = sp.Popen(shlex.split(f'{gstreamer_exe} --quiet rtspsrc location={rtsp_url} ! queue2 ! rtph264depay ! avdec_h264 ! videoconvert ! capsfilter caps="video/x-raw, format=BGR" ! fdsink'), stdout=sp.PIPE)

while True:
    raw_image = p.stdout.read(width * height * 3)

    if len(raw_image) < width*height*3:
        break

    image = np.frombuffer(raw_image, np.uint8).reshape((height, width, 3))
    cv2.imshow('image', image)
    key = cv2.waitKey(1)

p.stdout.close()
p.wait()
cv2.destroyAllWindows()
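
Note that width and height are hard-coded above. If the frame size is not known in advance, one option (borrowed from the streaming script in the question) is to probe the stream once with OpenCV before launching the GStreamer process - a minimal sketch, assuming OpenCV's backend can open the same RTSP URL:

import cv2

# Sketch: query the frame size from the RTSP stream instead of hard-coding it.
rtsp_url = 'rtsp://wowzaec2demo.streamlock.net/vod/mp4:BigBuckBunny_115k.mp4'

probe = cv2.VideoCapture(rtsp_url)
width = int(probe.get(cv2.CAP_PROP_FRAME_WIDTH))
height = int(probe.get(cv2.CAP_PROP_FRAME_HEIGHT))
probe.release()
print('width={}, height={}'.format(width, height))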

