
FFmpeg Javacv - Latency Issue

I am using an Android (API level 21) device to stream video to a JavaFX application. It is working, but I have about 2 seconds of latency.

As of now, the basic transport pipeline looks like this:

  1. Android WebRTC/custom implementation: 16 ms
  2. Android packetizer (UDP): 6 ms
  3. UDP transport: assumed < 5 ms
  4. Windows depacketizer: no buildup of data in buffers
  5. Windows FFmpeg frame grabber: unknown latency
  6. JavaFX ImageView: < 1 ms

The data stream to my desktop and my packetizer run much faster than my frame rate and are often just waiting. There is no buildup of data anywhere else, so I assume there is no significant delay in my own code.

I tested my Android device by writing the YUV frames from the camera to a texture and timing how long it takes to encode a frame to H.264 and then how long until it is sent: 16 + 6 = 22 ms.

I suspect the problem is with the JavaCV FFmpeg frame grabber. I am studying this API to learn why this is happening.

My major concern is that the frame grabber takes forever to start: around 4 seconds.

Once it starts, I can clearly see how many frames I feed in and how many it grabs, and it always lags by some large number, anywhere from 40 up to 200 frames.

Also, FrameGrabber.grab() is blocking and returns roughly every 100 ms, matching my frame rate no matter how fast I call it, so I can never catch up.
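Here is a minimal diagnostic sketch for timing that pacing; it would sit inside the grab loop of the class shown below and only assumes the same grabber and TAG fields used there:

    // Diagnostic only: measure how long each grab() call blocks.
    long before = System.nanoTime();
    Frame f = grabber.grab();   // blocks until the grabber returns a decoded frame
    long blockedMs = (System.nanoTime() - before) / 1_000_000;
    System.out.println(TAG + " grab() blocked for " + blockedMs + " ms");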

Do you have any suggestions?

I am starting to think JavaCV is not a viable solution, because many people seem to struggle with this delay issue. If you have alternative suggestions, please advise.

My FFmpeg frame grabber:

    public RapidDecoder(final InputStream inputStream, final ImageView view)
    {
        System.out.println(TAG + " starting");

        grabber = new FFmpegFrameGrabber(inputStream, 0);
        converter = new Java2DFrameConverter();
        mView = view;

        // Decode thread: configure the grabber, start it, then pull frames as fast as possible.
        emptyBuffer = new Runnable() {
            @Override
            public void run() {
                System.out.println(TAG + " emptybuffer thread running");
                try {
                    grabber.setFrameRate(12);
                    grabber.setVideoBitrate(10000);

                    //grabber.setOption("g", "2");
                    //grabber.setOption("bufsize", "10000");
                    //grabber.setOption("af", "delay 20");
                    //grabber.setNumBuffers(0);
                    //grabber.setOption("flush_packets", "1");
                    //grabber.setOption("probsize", "32");
                    //grabber.setOption("analyzeduration", "0");
                    grabber.setOption("preset", "ultrafast");

                    grabber.setOption("fflags", "nobuffer");
                    //grabber.setVideoOption("nobuffer", "1");
                    //grabber.setOption("fflags", "discardcorrupt");
                    //grabber.setOption("framedrop", "\\");
                    //grabber.setOption("flags", "low_delay");
                    grabber.setOption("strict", "experimental");
                    //grabber.setOption("avioflags", "direct");
                    //grabber.setOption("filter:v", "fps=fps=30");
                    grabber.setVideoOption("tune", "zerolatency");
                    //grabber.setFrameNumber(60);

                    grabber.start();
                } catch (Exception e) {
                    System.out.println(TAG + e);
                }

                while (true) {
                    try {
                        grabFrame();
                        Thread.sleep(1);
                    } catch (Exception e) {
                        System.out.println(TAG + " emptybuffer " + e);
                    }
                }
            }
        };

        // Display thread: convert the most recent frame and push it to the JavaFX ImageView.
        display = new Runnable() {
            @Override
            public void run() {
                System.out.println(TAG + " display thread running ");

                while (true) {
                    try {
                        displayImage();
                        Thread.sleep(10);
                    } catch (Exception e) {
                        System.out.println(TAG + " display " + e);
                    }
                }
            }
        };
    }

    public void generateVideo()
    {
        System.out.println(TAG + " genvid ");

        new Thread(emptyBuffer).start();
        new Thread(display).start();
    }

    public synchronized void grabFrame() throws FrameGrabber.Exception
    {
        //frame = grabber.grabFrame();
        frame = grabber.grab();
        //System.out.println("grab");
    }

    public synchronized void displayImage()
    {
        bufferedImage = converter.convert(frame);
        frame = null;
        if (bufferedImage == null) return;
        mView.setImage(SwingFXUtils.toFXImage(bufferedImage, null));
        //System.out.println("display");
    }

Here you can see where I draw the texture with the camera image and send it to the H.264 encoder:

    @Override
    public void onTextureFrameCaptured(int width, int height, int texId, float[] tranformMatrix, int rotation, long timestamp) {
        //Log.d(TAG, "onTextureFrameCaptured: ->");

        VideoRenderer.I420Frame frame = new VideoRenderer.I420Frame(width, height, rotation, texId, tranformMatrix, 0, timestamp);
        avccEncoder.renderFrame(frame);
        videoView.renderFrame(frame);
        surfaceTextureHelper.returnTextureFrame();
    }

Here you can see the WebRTC encoding happen:

    @Override
    public void renderFrame(VideoRenderer.I420Frame i420Frame) {
        start = System.nanoTime();
        bufferque++;

        mediaCodecHandler.post(new Runnable() {
            @Override
            public void run() {
                videoEncoder.encodeTexture(false, i420Frame.textureId, i420Frame.samplingMatrix, TimeUnit.NANOSECONDS.toMicros(i420Frame.timestamp));
            }
        });
    }

    /**
     * Called to retrieve an encoded frame
     */
    @Override
    public void onEncodedFrame(MediaCodecVideoEncoder.OutputBufferInfo frame, MediaCodec.BufferInfo bufferInfo) {
        b = new byte[frame.buffer().remaining()];
        frame.buffer().get(b);
        synchronized (lock) {
            encodedBuffer.add(b);
            lock.notifyAll();
            if (encodedBuffer.size() > 1) {
                Log.e(TAG, "drainEncoder: too big: " + encodedBuffer.size(), null);
            }
        }
        duration = System.nanoTime() - start;
        bufferque--;
        calcAverage();
        if (bufferque > 0) {
            Log.d(TAG, "onEncodedFrame: bufferque size: " + bufferque);
        }
    }

I edited my question above as I solved the problem over the course of a few days, but let me give the details for those who may need them.

Android - I ended up using this library: https://github.com/Piasy/VideoCRE . It tears the WebRTC functionality open and allows you to encode video frame by frame. That is how I benchmarked encoding at 16 ms per frame on an old, terrible phone.

JavaCV FFmpeg - The problem turned out to be a buffering issue in the native avcodec layer. To prove it, try feeding every frame in twice (or ten times) instead of once: it cuts the latency down by the same factor, although the feed becomes useless as well. It also reduces the startup time of the video feed. The actual fix: on line 926 of FFmpegFrameGrabber in the JavaCV code, I changed the thread count from 0 to 1, per this link: https://mailman.videolan.org/pipermail/x264-devel/2009-May/005880.html

The thread_count = 0 directs x264 to use enough threads to load all your CPU cores during encode. So you probably run the tests on a dual core machine (2 cores will have 3 threads). To get x264 encode without delay, set thread_count = 1.
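If you would rather not edit the JavaCV source, the same single-threaded decode can likely be requested through the codec options instead. This is an untested sketch: "threads" is the standard FFmpeg codec option, and my understanding is that setVideoOption() hands it to the video decoder when the grabber starts, but I only verified the source edit above.

    FFmpegFrameGrabber grabber = new FFmpegFrameGrabber(inputStream, 0);
    // Ask the H.264 decoder for a single thread so it does not queue frames
    // to fill a frame-threading pipeline before emitting output.
    grabber.setVideoOption("threads", "1");
    // Demuxer/format-level flags go through setOption() instead.
    grabber.setOption("fflags", "nobuffer");
    grabber.start();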

You may find countless suggestions for setting options through JavaCV; however, JavaCV never rejected the options I set, and I learned many times that I was affecting the wrong factors. Here is a list of the things I tried:

    //grabber.setFrameRate(12);
    //grabber.setVideoBitrate(10000);

    //grabber.setOption("g", "2");
    //grabber.setOption("bufsize", "10000");
    //grabber.setOption("af", "delay 20");
    //grabber.setNumBuffers(0);
    //grabber.setOption("flush_packets", "1");
    //grabber.setOption("probsize", "32");
    //grabber.setOption("analyzeduration", "0");
    //grabber.setOption("preset", "ultrafast");

    //grabber.setOption("fflags", "nobuffer");
    //grabber.setVideoOption("nobuffer", "1");
    //grabber.setOption("fflags", "discardcorrupt");
    //grabber.setOption("framedrop", "\\");
    //grabber.setOption("flags", "low_delay");
    //grabber.setOption("strict", "experimental");
    //grabber.setOption("avioflags", "direct");
    //grabber.setOption("filter:v", "fps=fps=30");
    //grabber.setOptions("look_ahead", "0");
    //Map options = new HashMap();
    //options.put("tune", "zerolatency");
    grabber.setVideoOption("look_ahead", "0");
    //grabber.setFrameNumber(60);

None of them worked, and as you read the documentation you will understand that when FFmpeg starts up there are different contexts (the AV context, the video context, and the audio context) which take different values, and there are different APIs (the frame grabber vs. ffplay, I believe) which take different flags, so throwing things at the wall is rather futile.

Try adding the extra frames to your stream first. Also, if you only need a single image, just add a null packet to your input stream and it will flush the buffer.
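As a rough sketch of the duplication trick (a diagnostic, not a fix, since the duplicated frames get displayed too): onAccessUnit and pipeOut are hypothetical names for wherever your depacketizer writes a complete H.264 frame into the stream that feeds the grabber.

    // Hypothetical depacketizer hook; pipeOut is the PipedOutputStream whose
    // PipedInputStream was handed to FFmpegFrameGrabber.
    void onAccessUnit(byte[] accessUnit) throws IOException {
        pipeOut.write(accessUnit);  // the real frame
        pipeOut.write(accessUnit);  // duplicate: pushes the buffered frame out of the decoder immediately
        pipeOut.flush();
    }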

If you need to stream video for robotic vision, check out my blog post: http://cagneymoreau.com/stream-video-android/
