
FFmpeg JavaCV - Latency Issue

I am using an Android v21 device to stream data to a JavaFX application. It's working fine, but I have about 2 seconds of latency.

As of now, the basic transport pipeline goes like this:

  1. Android WebRTC/custom implementation: 16 ms
  2. Android packetizer (UDP): 6 ms
  3. UDP transport: assumed < 5 ms
  4. Windows depacketizer: no buildup of data in buffers
  5. Windows FFmpeg framegrabber: unknown latency
  6. JavaFX ImageView: < 1 ms

The data streams to my desktop, and my packetizer is much faster than my frame rate and is often just waiting. There is no buildup of data anywhere else, so I assume none of my own code adds any delay.

I tested my Android device by writing the YUV from the camera to a texture and timing how long the device takes to encode the frame into H264, and then how long until it is sent: 16 + 6 = 22 ms.

I feel the problem is with the JavaCV FFmpeg framegrabber, since the other stages above account for less than 30 ms of the roughly 2 seconds. I'm studying this API in order to learn why this is occurring.

My major concern is that the framegrabber takes forever to start... around 4 seconds.

Once it starts, I can clearly see how many frames I insert and how many it is grabbing, and it always lags by some large number, from 40 up to 200 frames.

Also, Framegrabber.grab() is blocking and runs every 100 ms to match my frame rate, no matter how fast I tell it to run, so I can never catch up.
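
For reference, the lag measurement above boils down to two counters inside the decoder class. A minimal sketch, assuming a grabber field like the one in the code below and a framesFed counter incremented by the depacketizer thread (both names are hypothetical):

    import java.util.concurrent.atomic.AtomicLong;
    import org.bytedeco.javacv.Frame;
    import org.bytedeco.javacv.FrameGrabber;

    // framesFed is incremented by the depacketizer thread each time it writes
    // one encoded frame into the grabber's InputStream; framesGrabbed here.
    final AtomicLong framesFed = new AtomicLong();
    final AtomicLong framesGrabbed = new AtomicLong();

    void measureLag() throws FrameGrabber.Exception {
        while (true) {
            Frame f = grabber.grab();   // blocks at the feed's frame rate
            if (f == null) break;       // null means end of stream
            long lag = framesFed.get() - framesGrabbed.incrementAndGet();
            System.out.println("decoder lag: " + lag + " frames"); // 40 to 200 here
        }
    }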

Do you have any suggestions?

I'm starting to think JavaCV is not a viable solution, because it seems many people struggle with this delay issue. If you have alternate suggestions, please advise.

My FFmpeg framegrabber:

public RapidDecoder(final InputStream inputStream, final ImageView view)
{
    System.out.println(TAG + " starting");

    grabber = new FFmpegFrameGrabber(inputStream, 0);
    converter = new Java2DFrameConverter();
    mView = view;

    emptyBuffer = new Runnable() {
        @Override
        public void run() {
            System.out.println(TAG + " emptybuffer thread running");
            try {

                grabber.setFrameRate(12);
                grabber.setVideoBitrate(10000);

                //grabber.setOption("g", "2");
               // grabber.setOption("bufsize", "10000");
                //grabber.setOption("af", "delay 20");
                //grabber.setNumBuffers(0);
                //grabber.setOption("flush_packets", "1");
                //grabber.setOption("probsize", "32");
                //grabber.setOption("analyzeduration", "0");
                grabber.setOption("preset", "ultrafast");

                grabber.setOption("fflags", "nobuffer");
                //grabber.setVideoOption("nobuffer", "1");
                //grabber.setOption("fflags", "discardcorrupt");
                //grabber.setOption("framedrop", "\\");
               //grabber.setOption("flags","low_delay");
                grabber.setOption("strict","experimental");
                //grabber.setOption("avioflags", "direct");
                //grabber.setOption("filter:v", "fps=fps=30");
                grabber.setVideoOption("tune", "zerolatency");
                //grabber.setFrameNumber(60);


                grabber.start();
            }catch (Exception e)
            {
                System.out.println(TAG + e);
            }

            while (true)
            {

                try{
                    grabFrame();
                    Thread.sleep(1);
                }catch (Exception e)
                {
                    System.out.println(TAG + " emptybuffer " + e);
                }

            }
        }
    };

    display = new Runnable() {
        @Override
        public void run() {

            System.out.println(TAG + " display thread running ");

            while(true)
            {

                try{
                    displayImage();
                    Thread.sleep(10);
                }catch (Exception e)
                {
                    System.out.println(TAG + " display " + e);
                }

            }
        }
    };
}


public void generateVideo()
{
    System.out.println(TAG + " genvid ");

    new Thread(emptyBuffer).start();
    new Thread(display).start();
}



public synchronized void grabFrame() throws FrameGrabber.Exception
{
    //frame = grabber.grabFrame();
    frame = grabber.grab();
    //System.out.println("grab");
}

public synchronized void displayImage()
{
    bufferedImage = converter.convert(frame);
    frame = null;
    if (bufferedImage == null) return;
    // Scene-graph updates must run on the JavaFX Application Thread
    final javafx.scene.image.Image fxImage = SwingFXUtils.toFXImage(bufferedImage, null);
    Platform.runLater(() -> mView.setImage(fxImage));
    //System.out.println("display");
}

Here you can see I draw the texture with the image and send it to the H264 encoder:

    @Override
    public void onTextureFrameCaptured(int width, int height, int texId, float[] tranformMatrix, int rotation, long timestamp) {
        //Log.d(TAG, "onTextureFrameCaptured: ->");

        VideoRenderer.I420Frame frame = new VideoRenderer.I420Frame(width, height, rotation, texId, tranformMatrix, 0, timestamp);
        avccEncoder.renderFrame(frame);
        videoView.renderFrame(frame);
        surfaceTextureHelper.returnTextureFrame();
    }

Here you can see the WebRTC encoding happen:

    @Override
    public void renderFrame(VideoRenderer.I420Frame i420Frame) {
        start = System.nanoTime();
        bufferque++;

        // Encode on the MediaCodec handler thread
        mediaCodecHandler.post(new Runnable() {
            @Override
            public void run() {
                videoEncoder.encodeTexture(false, i420Frame.textureId, i420Frame.samplingMatrix, TimeUnit.NANOSECONDS.toMicros(i420Frame.timestamp));
            }
        });
    }

    /**
     * Called to retrieve an encoded frame
     */
    @Override
    public void onEncodedFrame(MediaCodecVideoEncoder.OutputBufferInfo frame, MediaCodec.BufferInfo bufferInfo) {
        // Copy the encoded bytes out and hand them to the consumer waiting on 'lock'
        b = new byte[frame.buffer().remaining()];
        frame.buffer().get(b);
        synchronized (lock)
        {
            encodedBuffer.add(b);
            lock.notifyAll();
            if (encodedBuffer.size() > 1)
            {
                Log.e(TAG, "drainEncoder: too big: " + encodedBuffer.size(), null);
            }
        }
        duration = System.nanoTime() - start;
        bufferque--;
        calcAverage();
        if (bufferque > 0)
        {
            Log.d(TAG, "onEncodedFrame: bufferque size: " + bufferque);
        }
    }

I edited my question above as I solved the problem over the course of a few days, but let me give details for those who may need them.

Android - I ended up using this library: https://github.com/Piasy/VideoCRE. It tears the WebRTC stack open and allows you to encode video frame by frame. That's how I benchmarked the frames at 16 ms for encoding on an old, terrible phone.

javacv ffmpeg - The solution was a buffering issue in the C++ avcodec. To prove it, try feeding every frame in twice, or ten times, instead of once. It cuts down latency by the same factor, although the feed becomes useless as well. It also reduces the startup time of the video feed. However, on line 926 of FFmpegFrameGrabber in the JavaCV code, I set the thread count from (0) to (1), per this link: https://mailman.videolan.org/pipermail/x264-devel/2009-May/005880.html

The thread_count = 0 directs x264 to use enough threads to load all your CPU cores during encode. So you probably run the tests on a dual-core machine (2 cores will have 3 threads). To get x264 encoding without delay, set thread_count = 1.
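
For reference, a sketch of the same idea without patching the JavaCV source, assuming a JavaCV version that forwards setVideoOption() entries to avcodec_open2() ("threads" is the standard libavcodec codec option; verify against your version):

    import java.io.InputStream;
    import org.bytedeco.javacv.FFmpegFrameGrabber;
    import org.bytedeco.javacv.FrameGrabber;

    public class LowLatencyGrabber {
        // The source patch was: video_c.thread_count(1); // was (0) = load all cores
        static FFmpegFrameGrabber open(InputStream in) throws FrameGrabber.Exception {
            FFmpegFrameGrabber grabber = new FFmpegFrameGrabber(in, 0);
            grabber.setVideoOption("threads", "1"); // single decode thread: no frame-threading delay
            grabber.start();
            return grabber;
        }
    }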

You may find countless suggestions for setting options through JavaCV; however, JavaCV never rejected any option I set, and I learned many times that I was affecting the wrong factors. Here is a list of the things I tried:

                //grabber.setFrameRate(12);
                //grabber.setVideoBitrate(10000);

                //grabber.setOption("g", "2");
               // grabber.setOption("bufsize", "10000");
                //grabber.setOption("af", "delay 20");
                //grabber.setNumBuffers(0);
                //grabber.setOption("flush_packets", "1");
                //grabber.setOption("probsize", "32");
                //grabber.setOption("analyzeduration", "0");
                //grabber.setOption("preset", "ultrafast");

                //grabber.setOption("fflags", "nobuffer");
                //grabber.setVideoOption("nobuffer", "1");
                //grabber.setOption("fflags", "discardcorrupt");
                //grabber.setOption("framedrop", "\\");
               //grabber.setOption("flags","low_delay");
                //grabber.setOption("strict","experimental");
                //grabber.setOption("avioflags", "direct");
                //grabber.setOption("filter:v", "fps=fps=30");
                //grabber.setOptions("look_ahead", "0");
                //Map options = new HashMap();
                //options.put("tune", "zerolatency");
                grabber.setVideoOption("look_ahead", "0");
                //grabber.setFrameNumber(60);

None of them worked, and as you read the documentation you will understand that when FFmpeg starts up there are different contexts (avcontext, videocontext, audiocontext) which take different values, and different APIs (framegrabber, ffplay) which take different flags (I believe), so throwing things at the wall is rather futile.
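
To make that concrete, here is my understanding of where JavaCV routes the two setters (an assumption from reading FFmpegFrameGrabber's source; verify against your version):

    // Format/demuxer-level options go through setOption() and end up in
    // avformat_open_input(); codec-level options go through setVideoOption()
    // and end up in avcodec_open2() for the video stream.
    grabber.setOption("fflags", "nobuffer");       // demuxer flag
    grabber.setOption("analyzeduration", "0");     // demuxer probe time
    grabber.setVideoOption("threads", "1");        // decoder option
    grabber.setVideoOption("tune", "zerolatency"); // x264 *encoder* option; a decoder ignores it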

Try adding the extra frames to your stream first. Also, if you only need a single image, just add a null packet to your input stream and it will flush the buffer.
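
A sketch of that diagnostic, assuming the depacketizer writes encoded H264 access units into an OutputStream that pipes into the grabber (the method name and wiring are hypothetical):

    import java.io.IOException;
    import java.io.OutputStream;

    // Diagnostic only: duplicating every access unit makes the decoded feed
    // useless, but if the latency drops by the same factor it proves the
    // delay sits in avcodec's buffering. 'out' is the pipe that feeds the
    // FFmpegFrameGrabber's InputStream.
    static void feedTwice(OutputStream out, byte[] accessUnit) throws IOException {
        out.write(accessUnit);
        out.write(accessUnit); // same frame again, to push the buffer through
        out.flush();
    }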

If you need to stream video for robotic vision, check out my blog post: http://cagneymoreau.com/stream-video-android/
