I have an original video file "inputVideoFile.mp4"
I use ffmpeg to decode the input video file, process each frame, and then encode those frames to "outputVideoFile.mp4".
I do not flush the delayed frames, which means the output video file should have fewer frames than the input video file.
However, when I use ffprobe to inspect the two files, it shows their durations are the same:
$ ffprobe inputVideoFile.mp4
Duration: 00:00:04.08, start: 0.000000, bitrate: 7835 kb/s
$ ffprobe outputVideoFile.mp4
Duration: 00:00:04.08, start: 0.000000, bitrate: 21055 kb/s
Why is this the case? And by the way, what is the unit of the fractional part in 00:00:04.08? Is the ".08" measured in 1/60ths of a second, or 1/100ths of a second?
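To the units question: ffprobe prints duration as HH:MM:SS.cc, where the two digits after the dot are hundredths of a second (centiseconds), not 1/60ths. A minimal sketch of converting such a string to seconds (the function name is my own, not part of any library):

```python
def parse_ffprobe_duration(s: str) -> float:
    """Convert an ffprobe HH:MM:SS.cc duration string to seconds.

    The digits after the dot are hundredths of a second (1/100 s).
    """
    hours, minutes, seconds = s.split(":")
    return int(hours) * 3600 + int(minutes) * 60 + float(seconds)

# "00:00:04.08" is 4 seconds plus 8 hundredths of a second
print(parse_ffprobe_duration("00:00:04.08"))  # 4.08
```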
Thanks!
I found the answer:
av_guess_frame_rate(a_AVFormatContext, a_AVStream, NULL)
It works!
No, it is not accurate. It is a guess based on the average bitrate and the file size. Obviously, the file size is always accurate, but in some cases the average bitrate is not properly recorded. I don't know, however, whether this is a fundamental limitation of the file format or a problem caused by the system/program that created the video file. For example, with MythTV recordings in the NuppelVideo format, FFmpeg's estimate can be off by 2-3x; that is, it estimates 1-hour recordings to be 2 or 3 hours long.
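The estimate described above is plain arithmetic: duration ≈ total bits / average bits per second. A minimal illustration (the numbers are hypothetical, not taken from the files in the question) of how a misrecorded bitrate skews the result by exactly the factor it is off by:

```python
def estimate_duration_seconds(filesize_bytes: int, bitrate_bps: int) -> float:
    """Estimate duration from file size and average bitrate:
    total bits divided by bits per second."""
    return filesize_bytes * 8 / bitrate_bps

# Hypothetical 1-hour recording with a true average bitrate of 6 Mb/s
filesize = 6_000_000 // 8 * 3600  # bytes for exactly one hour at 6 Mb/s

print(estimate_duration_seconds(filesize, 6_000_000))  # 3600.0 s with the correct bitrate
print(estimate_duration_seconds(filesize, 2_000_000))  # 10800.0 s if the bitrate is recorded 3x too low
```

This matches the MythTV/NuppelVideo symptom above: a bitrate recorded 2-3x too low makes the file look 2-3x longer than it is.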