ARGB images to gstreamer pipeline
In an attempt to speed up the animation of a series of matplotlib-produced images in a video, I want to convert the following (working) example to gstreamer, so that I can use the hardware encoding features of the Raspberry Pi 2 GPU.
The starting point is the code below, which uses stdin to stream the ARGB video frames to ffmpeg. How can I replace the command string to work with gst-launch-1.0 instead?
import numpy as np
import matplotlib.pylab as plt
import time
import subprocess

# Number of frames
nframes = 200

# Generate data
x = np.linspace(0, 100, num=nframes)
y = np.random.random_sample(np.size(x))

def testSubprocess(x, y):
    start_time = time.time()

    # Set up the figure
    fig = plt.figure(figsize=(15, 9))
    canvas_width, canvas_height = fig.canvas.get_width_height()

    # First frame
    ax0 = plt.plot(x, y)
    pointplot, = plt.plot(x[0], y[0], 'or')

    def update(frame):
        # your matplotlib code goes here
        pointplot.set_data(x[frame], y[frame])

    # Open an ffmpeg process
    outf = 'testSubprocess.mp4'
    cmdstring = ('ffmpeg',
                 '-y', '-r', '1',  # overwrite, 1 fps
                 '-s', '%dx%d' % (canvas_width, canvas_height),  # size of image string
                 '-pix_fmt', 'argb',  # pixel format
                 '-f', 'rawvideo', '-i', '-',  # tell ffmpeg to expect raw video from the pipe
                 '-vcodec', 'mpeg4', outf)  # output encoding
    p = subprocess.Popen(cmdstring, stdin=subprocess.PIPE)

    # Draw frames and write to the pipe
    for frame in range(nframes):
        # draw the frame
        update(frame)
        fig.canvas.draw()

        # extract the image as an ARGB byte string
        string = fig.canvas.tostring_argb()

        # write to pipe
        p.stdin.write(string)

    # Finish up
    p.communicate()
    print("Movie written in %s seconds" % (time.time() - start_time))

if __name__ == "__main__":
    testSubprocess(x, y)
The component to be used in the pipeline is omxh264enc, but that can also be added as a second step once I understand how to feed data into a pipeline. An answer based on gst-python is also completely acceptable.
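For the gst-python route, here is a minimal sketch of what feeding the frames through an appsrc element could look like. This is an assumption, not tested code from the answer: the element names (appsrc, videoconvert, omxh264enc, h264parse, mp4mux, filesink), the ARGB caps string, and the push-buffer/end-of-stream signals are taken from the general PyGObject GStreamer API, and the helper names are hypothetical:

```python
# Hypothetical gst-python sketch: feed raw ARGB byte strings into a
# GStreamer pipeline through an appsrc element.  omxh264enc is the
# Raspberry Pi hardware encoder mentioned in the question.

def pipeline_description(width, height, fps, outfile):
    """Build an appsrc-based pipeline description string."""
    caps = ('video/x-raw,format=ARGB,width=%d,height=%d,framerate=%d/1'
            % (width, height, fps))
    return ('appsrc name=src caps="%s" ! videoconvert ! omxh264enc ! '
            'h264parse ! mp4mux ! filesink location=%s' % (caps, outfile))

def stream_frames(frames, width, height, fps, outfile):
    """Push each raw ARGB byte string in `frames` into the pipeline."""
    import gi
    gi.require_version('Gst', '1.0')
    from gi.repository import Gst

    Gst.init(None)
    pipeline = Gst.parse_launch(
        pipeline_description(width, height, fps, outfile))
    appsrc = pipeline.get_by_name('src')
    pipeline.set_state(Gst.State.PLAYING)
    for data in frames:
        # Wrap the raw frame bytes in a GstBuffer and hand it to appsrc
        appsrc.emit('push-buffer', Gst.Buffer.new_wrapped(data))
    appsrc.emit('end-of-stream')
    # Block until the muxer has finished writing the file (or an error)
    pipeline.get_bus().timed_pop_filtered(
        Gst.CLOCK_TIME_NONE, Gst.MessageType.EOS | Gst.MessageType.ERROR)
    pipeline.set_state(Gst.State.NULL)
```

In the script above, `stream_frames` would replace the subprocess setup, with `fig.canvas.tostring_argb()` supplying each element of `frames`.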
"videoparse" looks like the key element you want to use here, to tell GStreamer what format your input is:
gst-launch-1.0 fdsrc ! videoparse width=128 height=128 format=argb framerate=5/1 ! videorate ! videoconvert ! omxh264enc ! h264parse ! mp4mux ! filesink location=OUTPUTFILE
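Tying this back to the question's script, the ffmpeg cmdstring could be swapped for a gst-launch-1.0 argument list along these lines. This is a sketch, not verified on a Pi: fdsrc reads from the process's stdin pipe by default, and the width/height/fps values must be filled in from the actual figure size (128x128 and 5/1 above were only placeholders):

```python
# Sketch: build the gst-launch-1.0 argument list that replaces the ffmpeg
# cmdstring in the question.  fdsrc reads the raw ARGB frames from stdin;
# width, height and fps must match what matplotlib actually produces.
def gst_cmdstring(width, height, fps, outfile):
    pipeline = ('fdsrc ! '
                'videoparse width=%d height=%d format=argb framerate=%d/1 ! '
                'videorate ! videoconvert ! omxh264enc ! h264parse ! '
                'mp4mux ! filesink location=%s' % (width, height, fps, outfile))
    # gst-launch-1.0 expects each pipeline token as a separate argument
    return ['gst-launch-1.0'] + pipeline.split()

# Used exactly like the original cmdstring:
#   p = subprocess.Popen(gst_cmdstring(canvas_width, canvas_height, 1,
#                                      'testSubprocess.mp4'),
#                        stdin=subprocess.PIPE)
```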