
How to deal with a live raw h264 stream to send over network

What I want to do is send a live camera stream, encoded as h264, into GStreamer. I have already seen many examples that send over the network using RTP or MPEG-TS. But the problem is that all those examples assume the input is a fixed file, or a live stream that has already been muxed into a transport protocol, like below:

client: gst-launch-1.0 videotestsrc horizontal-speed=5 ! x264enc tune="zerolatency" threads=1 ! mpegtsmux ! tcpserversink host=192.168.0.211 port=8554

server: gst-launch-1.0 tcpclientsrc port=8554 host=192.168.0.211 ! tsdemux ! h264parse ! avdec_h264 ! xvimagesink

But my camera offers only the interface below (written in Java; it actually runs on Android). The interface delivers nothing but live raw h264 blocks:

mReceivedVideoDataCallBack = new DJIReceivedVideoDataCallBack() {
    @Override
    public void onResult(byte[] videoBuffer, int size)
    {
        // videoBuffer holds the next block of the raw h264 stream
    }
};

I can create a TCP session to send those data blocks. But how can I turn this data, which is not packed in any transport protocol, into a format that a GStreamer TCP client can understand? Transcoding the original stream into TS format on the camera side could be a solution, but I have no clue how to transcode from data that is neither a file nor already in a transport format. I have searched GStreamer and ffmpeg, but until now I could not find a way to handle an h264 block stream through the supported interfaces. Or, is there any way to make GStreamer directly accept these simple raw h264 blocks?
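As a sketch (not part of the question, and assuming the camera app writes Annex-B byte-stream NAL units with in-band SPS/PPS to the socket): `h264parse` can recover NAL boundaries from a plain byte stream, so a receiving pipeline along these lines may decode the blocks without any container; the host/port values are taken from the question's example:

```shell
# Receiver: treat the TCP payload as a raw Annex-B h264 byte-stream.
# h264parse finds NAL unit boundaries; avdec_h264 decodes.
gst-launch-1.0 tcpclientsrc host=192.168.0.211 port=8554 \
    ! h264parse ! avdec_h264 ! autovideosink
```

Depending on the GStreamer version, an explicit `video/x-h264,stream-format=byte-stream` capsfilter between `tcpclientsrc` and `h264parse` may be needed for negotiation to succeed.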

I think the best solution is to create your own element for your video source and then construct a pipeline using your element and mpegtsmux.

However, you can also use appsrc + mpegtsmux and feed your appsrc through JNI with the buffers you get from the callback.
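A minimal sketch of the callback-to-appsrc hand-off on the Java side (the class and method names are hypothetical, not a real DJI or GStreamer API): the callback copies each block into a bounded queue, and a feeder thread drains the queue into the appsrc through a JNI stub that would wrap the bytes in a `GstBuffer` and call `gst_app_src_push_buffer()` natively:

```java
import java.util.Arrays;
import java.util.concurrent.ArrayBlockingQueue;
import java.util.concurrent.BlockingQueue;

// Hypothetical bridge between the camera callback and a GStreamer appsrc.
// pushToAppsrc() stands in for a JNI native method; here it is a no-op stub
// so the buffer-marshalling logic can be shown on its own.
public class H264AppsrcBridge {
    // Bounded queue: decouples the SDK's callback thread from the feeder thread.
    private final BlockingQueue<byte[]> queue = new ArrayBlockingQueue<>(64);

    // Called from the camera callback thread. Copy only the valid bytes,
    // since the SDK may reuse videoBuffer after the callback returns.
    public void onVideoData(byte[] videoBuffer, int size) {
        queue.offer(Arrays.copyOf(videoBuffer, size)); // drop block if queue is full
    }

    // Feeder loop body, run on its own thread: pushes one block to appsrc.
    // Returns the number of bytes pushed, or -1 if interrupted.
    public int drainOnce() {
        try {
            byte[] block = queue.take();
            pushToAppsrc(block); // JNI: wrap in GstBuffer, gst_app_src_push_buffer()
            return block.length;
        } catch (InterruptedException e) {
            Thread.currentThread().interrupt();
            return -1;
        }
    }

    // Stub for what would be: private native int pushToAppsrc(byte[] block);
    private void pushToAppsrc(byte[] block) { /* no-op in this sketch */ }
}
```

The copy in `onVideoData` matters because the callback gives you the buffer and a separate `size`; pushing the whole array would feed trailing garbage into the parser.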

