
Capture video (Camera API) and simple use of MediaCodec

Lately I have been working with MediaRecorder to capture video and process its output. However, as it turns out, security restrictions prevented me from getting at the output stream of the MediaRecorder (the problem is described in the link below):

"Seekable" file descriptor to use with MediaRecorder Android 6.0 (API 23)

So I had to come up with another solution and decided to work with the Camera API and get the stream there. The first approach was to work with onPreviewFrame, capture the frames to a file, and convert colors and formats (MediaCodec). Luckily, the color-conversion problem can be circumvented by getting the video from e.g. a SurfaceTexture, as described for example in Bigflake's project:

https://bigflake.com/mediacodec/CameraToMpegTest.java.txt
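For contrast, the onPreviewFrame route mentioned above looks roughly like the following sketch (Camera1 API; the class and callback wiring here are illustrative, not from any of the linked samples). Frames arrive as NV21 byte arrays by default, which is exactly the color-conversion burden the SurfaceTexture path avoids:

```java
import android.hardware.Camera;

// Hedged sketch of the abandoned byte-buffer approach, assuming the
// deprecated Camera1 API. Error handling and preview-surface setup
// are omitted for brevity.
class PreviewFrameGrabber {
    void start() throws Exception {
        Camera camera = Camera.open();
        camera.setPreviewCallback(new Camera.PreviewCallback() {
            @Override
            public void onPreviewFrame(byte[] data, Camera cam) {
                // `data` is an NV21 (YUV420SP) buffer by default; converting
                // it to the color format a MediaCodec encoder expects is the
                // painful part that the SurfaceTexture route sidesteps.
            }
        });
        camera.startPreview();
    }
}
```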

I am not a total newbie in Android Java, but this is really overwhelming me. I don't want a ready-made recipe, and I am fine with sitting down for the whole of next week and cracking that code, but my questions are: firstly, how did you come to understand MediaCodec taking video from e.g. a SurfaceTexture and then feeding it into MediaMuxer, and secondly, can you recommend tutorials that begin with the simplest possible project on this topic and then gradually expand the code?
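The pipeline being asked about (SurfaceTexture → MediaCodec encoder → MediaMuxer) boils down to a drain loop like the sketch below. This is a hedged outline of the hand-off at the heart of CameraToMpegTest, not its actual code; `encoder` and `muxer` are assumed to be configured elsewhere, and the method name is made up:

```java
import android.media.MediaCodec;
import android.media.MediaMuxer;
import java.nio.ByteBuffer;

class EncoderDrainSketch {
    // Drain encoded frames from the codec and write them to the muxer.
    // Key detail: the muxer may only start AFTER the encoder reports
    // its output format (INFO_OUTPUT_FORMAT_CHANGED).
    void drainEncoder(MediaCodec encoder, MediaMuxer muxer) {
        MediaCodec.BufferInfo info = new MediaCodec.BufferInfo();
        int trackIndex = -1;
        while (true) {
            int index = encoder.dequeueOutputBuffer(info, 10_000);
            if (index == MediaCodec.INFO_OUTPUT_FORMAT_CHANGED) {
                trackIndex = muxer.addTrack(encoder.getOutputFormat());
                muxer.start();
            } else if (index == MediaCodec.INFO_TRY_AGAIN_LATER) {
                break; // no output ready yet
            } else if (index >= 0) {
                ByteBuffer encoded = encoder.getOutputBuffer(index);
                if (info.size > 0 && trackIndex >= 0) {
                    muxer.writeSampleData(trackIndex, encoded, info);
                }
                encoder.releaseOutputBuffer(index, false);
                if ((info.flags & MediaCodec.BUFFER_FLAG_END_OF_STREAM) != 0) {
                    break;
                }
            }
        }
    }
}
```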

I am really trying to work through Bigflake's project, but I am lost, not least because the onCreate method is missing, and the hardest part begins where the video rendering starts.

Bigflake's MediaCodec page mostly contains tests for MediaCodec. If you still want to use it as a reference, start from encodeCameraToMpeg() in CameraToMpegTest, and also take a look at EncodeAndMux to get an idea of how to set up the MediaCodec encoder.
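The encoder setup that both EncodeAndMux and CameraToMpegTest perform looks roughly like this sketch, assuming API 18+ (createInputSurface requires it). The resolution and bitrate values are placeholders:

```java
import android.media.MediaCodec;
import android.media.MediaCodecInfo;
import android.media.MediaFormat;
import android.view.Surface;

class EncoderSetupSketch {
    // Configure an H.264 encoder that takes its input from a Surface,
    // which is what the camera's SurfaceTexture frames get rendered into.
    Surface createEncoderSurface() throws Exception {
        MediaFormat format = MediaFormat.createVideoFormat("video/avc", 640, 480);
        format.setInteger(MediaFormat.KEY_COLOR_FORMAT,
                MediaCodecInfo.CodecCapabilities.COLOR_FormatSurface);
        format.setInteger(MediaFormat.KEY_BIT_RATE, 2_000_000);
        format.setInteger(MediaFormat.KEY_FRAME_RATE, 30);
        format.setInteger(MediaFormat.KEY_I_FRAME_INTERVAL, 5);

        MediaCodec encoder = MediaCodec.createEncoderByType("video/avc");
        encoder.configure(format, null, null, MediaCodec.CONFIGURE_FLAG_ENCODE);
        // Must be called after configure() and before start().
        Surface inputSurface = encoder.createInputSurface();
        encoder.start();
        return inputSurface;
    }
}
```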

For a working video-capture sample, take a look at taehwandev's MediaCodecExample. For an example of how to decode your recorded video, see the BasicMediaDecode sample in the Google Samples repo.
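The decode side follows a symmetrical pattern: MediaExtractor pulls samples from the file and a MediaCodec decoder consumes them. A minimal setup sketch, assuming a local file path (the path here is hypothetical):

```java
import android.media.MediaCodec;
import android.media.MediaExtractor;
import android.media.MediaFormat;

class DecoderSetupSketch {
    // Find the video track in a recorded file and configure a decoder
    // for it. Passing a Surface instead of null to configure() would
    // render decoded frames directly to the screen.
    MediaCodec createDecoder(String path) throws Exception {
        MediaExtractor extractor = new MediaExtractor();
        extractor.setDataSource(path); // e.g. a recorded .mp4
        for (int i = 0; i < extractor.getTrackCount(); i++) {
            MediaFormat format = extractor.getTrackFormat(i);
            String mime = format.getString(MediaFormat.KEY_MIME);
            if (mime != null && mime.startsWith("video/")) {
                extractor.selectTrack(i);
                MediaCodec decoder = MediaCodec.createDecoderByType(mime);
                decoder.configure(format, /* surface */ null, null, 0);
                decoder.start();
                return decoder;
            }
        }
        throw new IllegalArgumentException("no video track found");
    }
}
```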

The advantage of using MediaCodec along with the Camera1 API is that you'll be able to support devices on API level 18 and upwards. If you're only targeting API level 21 and upwards, then Camera2 will work; here's an Android Camera2Video sample to refer to if needed.

Finally, it might also be worthwhile to look at the new CameraX API. Although it shouldn't be used in production yet, it's the direction Android's camera APIs are moving in, so it's probably worth looking through the official documentation and a guide or two (e.g. Exploring CameraX) to get the basic idea ahead of time.

NOTE - Do not use the CameraX API in production code yet, as the CameraX library is in alpha and its API surfaces aren't finalized. I'm merely mentioning it as an option to keep tabs on for future reference.

It is now almost a week later. I followed Chris's advice and worked through taehwandev's MediaCodecExample and then BasicMediaDecode, and I understand this code, although the latter gets much more complicated. However, in my opinion this does not help much with understanding Bigflake's CameraToMpegTest. I have to complain, because I have been thrown into the deep end and there seems to be little help beyond self-deduction. How did you all learn about rendering on Android, and where did you learn to use MediaCodec together with the Camera API? Are there any tutorials, books, or learning materials that address the question asked above? Since this comment should take the form of an answer: I would propose working with Grafika's "show + capture" activity, because it is at least implemented in an app environment.
