
Stream low-latency RTSP video to Android with FFmpeg

I am trying to stream live webcam video from an Ubuntu 12.04 PC to an Android device running KitKat. So far I've written an ffserver config file to receive the FFM feed and broadcast it over RTSP. I am able to watch the stream with ffplay on another computer in the same LAN.

How can I watch the stream on the Android device? The following code works well when the webcam image is streamed with VLC, but it doesn't work with FFmpeg:

import android.app.Activity;
import android.content.Context;
import android.media.MediaPlayer;
import android.net.Uri;
import android.os.Bundle;
import android.util.Log;
import android.view.SurfaceHolder;
import android.view.SurfaceView;
import android.view.Window;
import android.view.WindowManager;

public class MainActivity extends Activity implements MediaPlayer.OnPreparedListener,
        SurfaceHolder.Callback {

    final static String RTSP_URL = "rtsp://192.168.1.54:4424/test.sdp";

    private MediaPlayer _mediaPlayer;
    private SurfaceHolder _surfaceHolder;

    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        // Set up a full-screen black window.
        requestWindowFeature(Window.FEATURE_NO_TITLE);
        Window window = getWindow();
        window.setFlags(
                WindowManager.LayoutParams.FLAG_FULLSCREEN,
                WindowManager.LayoutParams.FLAG_FULLSCREEN);
        window.setBackgroundDrawableResource(android.R.color.black);
        getWindow().addFlags(WindowManager.LayoutParams.FLAG_KEEP_SCREEN_ON);
        setContentView(R.layout.activity_main);

        // Configure the view that renders live video.
        SurfaceView videoView =
                (SurfaceView) findViewById(R.id.videoView); // R.id.videoView is a SurfaceView element in the layout XML
        _surfaceHolder = videoView.getHolder();
        _surfaceHolder.addCallback(this);
        _surfaceHolder.setFixedSize(320, 240);
    }
    @Override
    public void surfaceCreated(SurfaceHolder surfaceHolder) {
        _mediaPlayer = new MediaPlayer();
        _mediaPlayer.setDisplay(_surfaceHolder);
        Context context = getApplicationContext();
        Uri source = Uri.parse(RTSP_URL);
        try {
            // Point the MediaPlayer at the RTSP stream.
            _mediaPlayer.setDataSource(context, source);

            // Begin the process of setting up a video stream.
            _mediaPlayer.setOnPreparedListener(this);
            _mediaPlayer.prepareAsync();
        }
        catch (Exception e) {
            // Log instead of silently swallowing setup failures.
            Log.e("MainActivity", "Failed to set data source", e);
        }
    }
    @Override
    public void onPrepared(MediaPlayer mediaPlayer) {
        _mediaPlayer.start();
    }

    @Override
    public void surfaceChanged(SurfaceHolder holder, int format, int width, int height) {
        // No-op: the surface size is fixed in onCreate().
    }

    @Override
    public void surfaceDestroyed(SurfaceHolder holder) {
        // Release the player when the surface goes away.
        if (_mediaPlayer != null) {
            _mediaPlayer.release();
            _mediaPlayer = null;
        }
    }
}
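
If MediaPlayer rejects the FFmpeg stream, attaching an error listener before calling prepareAsync() at least surfaces the failure code (a minimal sketch; the what/extra values map to the MediaPlayer.MEDIA_ERROR_* constants):

    _mediaPlayer.setOnErrorListener(new MediaPlayer.OnErrorListener() {
        @Override
        public boolean onError(MediaPlayer mp, int what, int extra) {
            // e.g. extra == MediaPlayer.MEDIA_ERROR_UNSUPPORTED (-1010) points at codec parameters
            Log.e("MainActivity", "Playback error: what=" + what + " extra=" + extra);
            return true; // handled; suppresses the OnCompletionListener
        }
    });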

My ffserver.config file:

HTTPPort 8090
RTSPBindAddress 0.0.0.0
RTSPPort 4424
MaxBandwidth 10000
CustomLog -

<Feed feed1.ffm>
        File /tmp/feed1.ffm
        FileMaxSize 20M
        ACL allow 127.0.0.1
</Feed>
<Stream test1.sdp>
    Feed feed1.ffm
    Format rtp
    VideoCodec libx264
    VideoSize 640x480
    AVOptionVideo flags +global_header
    AVOptionVideo me_range 16
    AVOptionVideo qdiff 4
    AVOptionVideo qmin 10
    AVOptionVideo qmax 51
    NoAudio
    ACL allow localhost
    ACL allow 192.168.0.0 192.168.255.255
</Stream>

I am starting the stream with this command:

    ffmpeg -f v4l2 -i /dev/video0 -c:v libx264 -b:v 600k http://localhost:8090/feed1.ffm

This is most likely caused by different encoding parameters between VLC and FFmpeg: VLC may use encoding parameters that Android can support, while FFmpeg may use unsupported ones (most likely the AVC profile and level). Try forcing the baseline or main profile and the YUV 4:2:0 pixel format through the FFmpeg command-line options and ffserver.config.
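
For example, the feed command could pin the profile and pixel format explicitly (a sketch using standard libx264/FFmpeg options; verify them against your build):

    ffmpeg -f v4l2 -i /dev/video0 -c:v libx264 -profile:v baseline -level 3.0 \
        -pix_fmt yuv420p -b:v 600k http://localhost:8090/feed1.ffm

Since ffserver re-encodes the feed according to the <Stream> section, the same constraints would also need to go into ffserver.config, e.g. (assuming your ffserver build accepts these directives):

    VideoCodec libx264
    PixelFormat yuv420p
    AVOptionVideo profile baseline

Android's stock MediaPlayer is only guaranteed to decode H.264 Baseline with YUV 4:2:0, which may be why the VLC-encoded stream plays while an unconstrained libx264 stream does not.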
