
Capture Android screen as a video file using FFmpeg

I am trying to capture an Android device's screen as a video file using FFmpeg with this command:

/data/local/ffmpeg -y -vcodec rawvideo -f rawvideo -pix_fmt rgb32 -s 320x480 -i /dev/graphics/fb0 /sdcard/output2.avi 2> /sdcard/out.txt

This creates a file with a single (unclear) frame and stops.

ffmpeg version 0.11.1 Copyright (c) 2000-2012 the FFmpeg developers
  built on Sep 20 2012 13:28:38 with gcc 4.6.x-google 20120106 (prerelease)
  configuration: --arch=arm --cpu=cortex-a8 --target-os=linux --enable-runtime-cpudetect --prefix=/data/data/org.witness.sscvideoproto --enable-pic --disable-shared --enable-static --cross-prefix=/opt/android-ndk/toolchains/arm-linux-androideabi-4.6/prebuilt/linux-x86/bin/arm-linux-androideabi- --sysroot=/opt/android-ndk/platforms/android-3/arch-arm --extra-cflags='-I../x264 -mfloat-abi=softfp -mfpu=neon' --extra-ldflags=-L../x264 --enable-version3 --enable-gpl --disable-doc --enable-yasm --enable-decoders --enable-encoders --enable-muxers --enable-demuxers --enable-parsers --enable-protocols --enable-filters --enable-avresample --enable-libfreetype --disable-indevs --enable-indev=lavfi --disable-outdevs --enable-hwaccels --enable-ffmpeg --disable-ffplay --disable-ffprobe --disable-ffserver --disable-network --enable-libx264 --enable-zlib
  libavutil      51. 54.100 / 51. 54.100
  libavcodec     54. 23.100 / 54. 23.100
  libavformat    54.  6.100 / 54.  6.100
  libavdevice    54.  0.100 / 54.  0.100
  libavfilter     2. 77.100 /  2. 77.100
  libswscale      2.  1.100 /  2.  1.100
  libswresample   0. 15.100 /  0. 15.100
  libpostproc    52.  0.100 / 52.  0.100
[rawvideo @ 0xee5540] Estimating duration from bitrate, this may be inaccurate
Input #0, rawvideo, from '/dev/graphics/fb0':
  Duration: N/A, start: 0.000000, bitrate: N/A
    Stream #0:0: Video: rawvideo (BGRA / 0x41524742), bgra, 320x480, 25 tbr, 25 tbn, 25 tbc
[buffer @ 0xef16e0] w:320 h:480 pixfmt:bgra tb:1/25 sar:0/1 sws_param:flags=2
[buffersink @ 0xef1950] No opaque field provided
[format @ 0xef1a70] auto-inserting filter 'auto-inserted scaler 0' between the filter 'src' and the filter 'format'
[scale @ 0xef2c10] w:320 h:480 fmt:bgra sar:0/1 -> w:320 h:480 fmt:yuv420p sar:0/1 flags:0x4
Output #0, avi, to '/sdcard/output2.avi':
  Metadata:
    ISFT            : Lavf54.6.100
    Stream #0:0: Video: mpeg4 (FMP4 / 0x34504D46), yuv420p, 320x480, q=2-31, 200 kb/s, 25 tbn, 25 tbc
Stream mapping:
  Stream #0:0 -> #0:0 (rawvideo -> mpeg4)
Press [q] to stop, [?] for help
frame=    5 fps=0.0 q=5.4 Lsize=     199kB time=00:00:00.20 bitrate=8156.6kbits/s    
video:193kB audio:0kB global headers:0kB muxing overhead 2.929166%

Any idea what I am doing wrong?

http://www.mail-archive.com/android-porting@googlegroups.com/msg17709.html

adb shell ioctl -rl 28 /dev/graphics/fb0 17920

The last 4 bytes are bits_per_pixel.
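For context, 17920 is 0x4600, the FBIOGET_VSCREENINFO ioctl from <linux/fb.h>, and the 28 bytes read back are the first seven 32-bit fields of struct fb_var_screeninfo (xres, yres, xres_virtual, yres_virtual, xoffset, yoffset, bits_per_pixel), so the last 4 bytes give the pixel depth. A minimal C sketch of the same query, using the device path from the question:

/* fbinfo.c - query framebuffer geometry and pixel depth.
 * Equivalent to: ioctl -rl 28 /dev/graphics/fb0 17920
 * (17920 == 0x4600 == FBIOGET_VSCREENINFO)
 */
#include <fcntl.h>
#include <linux/fb.h>
#include <stdio.h>
#include <sys/ioctl.h>
#include <unistd.h>

int main(void)
{
    struct fb_var_screeninfo vinfo;
    int fd = open("/dev/graphics/fb0", O_RDONLY);

    if (fd < 0) {
        perror("open /dev/graphics/fb0");
        return 1;
    }
    if (ioctl(fd, FBIOGET_VSCREENINFO, &vinfo) < 0) {
        perror("FBIOGET_VSCREENINFO");
        close(fd);
        return 1;
    }

    /* These are among the seven 32-bit fields covered by the 28 bytes
     * dumped by the ioctl command above; bits_per_pixel is the last. */
    printf("%ux%u, %u bits per pixel\n",
           vinfo.xres, vinfo.yres, vinfo.bits_per_pixel);

    close(fd);
    return 0;
}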

-i /dev/graphics/fb0 only reads fb0 once, so you only get a single frame. ffmpeg won't keep reading fb0. With the command line alone, you won't get this feature.
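A possible workaround (a rough sketch, not from the original answer): grab frames yourself in a loop and pipe the raw data into ffmpeg's stdin, so ffmpeg sees a continuous rawvideo stream. The 320x480 size and 4-byte bgra pixels below are taken from the log above; the usleep pacing and the rewind-and-re-read approach are assumptions that may need adjusting, e.g. for double-buffered framebuffers.

/* fbgrab.c - repeatedly dump raw framebuffer frames to stdout so that
 * ffmpeg can encode them from a pipe. Frame size (320x480, 4 bytes per
 * pixel = bgra) is taken from the log above and should really be derived
 * from FBIOGET_VSCREENINFO. */
#include <fcntl.h>
#include <stdio.h>
#include <sys/types.h>
#include <unistd.h>

#define FRAME_BYTES (320 * 480 * 4)

int main(void)
{
    static unsigned char buf[FRAME_BYTES];
    int fd = open("/dev/graphics/fb0", O_RDONLY);

    if (fd < 0) {
        perror("open /dev/graphics/fb0");
        return 1;
    }

    for (;;) {
        ssize_t n;
        /* Rewind and re-read the framebuffer for every frame;
         * plain ffmpeg only reads the device once. */
        if (lseek(fd, 0, SEEK_SET) == (off_t)-1)
            break;
        n = read(fd, buf, FRAME_BYTES);
        if (n != FRAME_BYTES)
            break;
        if (write(STDOUT_FILENO, buf, FRAME_BYTES) != FRAME_BYTES)
            break;
        usleep(100000);   /* crude ~10 fps pacing (assumption) */
    }

    close(fd);
    return 0;
}

Its output can then be fed to ffmpeg by replacing the framebuffer path with a pipe, e.g. -f rawvideo -pix_fmt bgra -s 320x480 -r 10 -i - before the output file.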
