Currently I am receiving a video stream (an H.264-encoded buffer) and an audio stream (a PCMU-encoded buffer) from the remote end, which I can decode and render as audio and video. Now I want to provide APIs like:
string fileName = "dir/dir2/..../rec.mp4";
startRecord()
stopRecord()
The user can start and stop recording at any time, and the video and audio streams should be written out as a combined MP4 file. I can use ffmpeg to merge a .h264 file and a .wav file into an .mp4 file, but I want to do this programmatically, directly from the streams (not from .h264 or .wav files), either using a library or writing my own code. Is this possible?
See this answer for details. Note, however, that MP4 does not support G.711 PCM mu-law encoded audio; either use an AVI or MOV container instead, or transcode the audio from PCM mu-law to AAC.
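The transcode route can be sketched with a single ffmpeg command. This is a minimal illustration, assuming the raw streams have been dumped to files named `video.h264` and `audio.ulaw` (8 kHz mono, as is typical for PCMU); doing the same fully in-process would use the corresponding libavformat/libavcodec APIs.

```shell
# Hypothetical file names for illustration (video.h264, audio.ulaw, rec.mp4).
# MP4 cannot carry G.711 mu-law, so the audio is transcoded to AAC while
# the H.264 video is stream-copied without re-encoding.
ffmpeg -f h264 -i video.h264 \
       -f mulaw -ar 8000 -ac 1 -i audio.ulaw \
       -c:v copy -c:a aac rec.mp4
```

The `-f h264` and `-f mulaw` flags tell ffmpeg how to interpret the headerless elementary streams; `-ar 8000 -ac 1` supplies the sample rate and channel count that raw mu-law data does not carry itself.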