
iOS ffmpeg how to run a command to trim remote url video?

I was initially using the AVFoundation libraries to trim video, but they have a limitation: they only work with local URLs, not remote ones.

So after further research I found the ffmpeg library, which can be included in an Xcode project for iOS. I have tested the following command to trim a remote video on the command line:

ffmpeg -y -ss 00:00:01.000 -i "http://i.imgur.com/gQghRNd.mp4" -t 00:00:02.000 -async 1 cut.mp4

which trims the .mp4 from the 1-second mark to the 3-second mark. This works perfectly via the command line on my Mac.

I have successfully compiled and included the ffmpeg library in an Xcode project, but I am not sure how to proceed further.

Now I am trying to figure out how to run this command in an iOS app using the ffmpeg libraries. How can I do this?

If you can point me to some helpful direction, I would really appreciate it! If I can get it resolved using your solution, I will award a bounty (in 2 days when it gives me the option).

I have some idea about this. However, I have very limited experience with iOS, so I am not sure whether my approach is the best way:

As far as I know, it is generally impossible to run command-line tools on iOS. You will likely have to write code that links against the ffmpeg libraries instead.

Here is all the work that needs to be done:

  1. Open the input file and initialize the ffmpeg contexts.
  2. Get the video stream and seek to the start timestamp. This may be complicated; see the ffmpeg tutorial for help, or check this to seek precisely while dealing with the troublesome key frames.
  3. Decode frames until one matches the end timestamp.
  4. Meanwhile, encode the decoded frames to a new file as output.

The examples in the ffmpeg source tree are a very good way to learn how to do this.

Some possibly useful code:

av_register_all();
avformat_network_init();

AVFormatContext* fmt_ctx = NULL; // must be NULL so avformat_open_input allocates it
avformat_open_input(&fmt_ctx, "http://i.imgur.com/gQghRNd.mp4", NULL, NULL);

avformat_find_stream_info(fmt_ctx, NULL);

AVCodec* dec;
int video_stream_index = av_find_best_stream(fmt_ctx, AVMEDIA_TYPE_VIDEO, -1, -1, &dec, 0);
AVCodecContext* dec_ctx = avcodec_alloc_context3(dec);
avcodec_parameters_to_context(dec_ctx, fmt_ctx->streams[video_stream_index]->codecpar);
// If there is audio you need, it must be decoded/encoded too.

avcodec_open2(dec_ctx, dec, NULL);
// decode initiation done

av_seek_frame(fmt_ctx, video_stream_index, frame_target, AVSEEK_FLAG_FRAME);
// or av_seek_frame(fmt_ctx, video_stream_index, timestamp_target, AVSEEK_FLAG_ANY)
// In most cases you will want AVSEEK_FLAG_BACKWARD, and then you must skip
// the frames between the keyframe it lands on and your target yourself.

AVPacket packet;
AVFrame* frame = av_frame_alloc();

int ret, got_frame, frames_decoded = 0;
while (av_read_frame(fmt_ctx, &packet) >= 0 && frames_decoded < seconds_needed * fps) {
    if (packet.stream_index == video_stream_index) {
        got_frame = 0;
        ret = avcodec_decode_video2(dec_ctx, frame, &got_frame, &packet);
        // This is the old ffmpeg decode/encode API. It is deprecated,
        // but still works for now.
        if (got_frame) {
            frames_decoded++;
            // encode the frame here
        }
    }
    av_packet_unref(&packet);
}
