iOS: AVPlayer - getting a snapshot of the current frame of a video
I have spent the whole day and gone through a lot of SO answers, Apple references, documentation, etc., but no success.

I want a simple thing: I am playing a video using AVPlayer and I want to pause it and get the current frame as a UIImage. That's it.

My video is an m3u8 file located on the internet; it plays normally in an AVPlayerLayer without any problems.
What have I tried:

1. AVAssetImageGenerator. It is not working: the method copyCGImageAtTime:actualTime:error: returns a null image ref. According to the answer here, AVAssetImageGenerator doesn't work for streaming videos.

2. renderInContext: on the AVPlayerLayer, but then I realized that it does not render this kind of "special" layer. Then I found a new method introduced in iOS 7, drawViewHierarchyInRect:afterScreenUpdates:, which should be able to render the special layers as well, but no luck: I still got the UI snapshot with a blank black area where the video is shown.

3. AVPlayerItemVideoOutput. I added a video output for my AVPlayerItem, but whenever I call hasNewPixelBufferForItemTime: it returns NO. I guess the problem is streaming video again, and I am not alone with this problem.

4. AVAssetReader. I was thinking of trying it but decided not to lose time after finding a related question here.

So isn't there any way to get a snapshot of something that I am seeing right now on the screen anyway? I can't believe this.
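For reference, the drawViewHierarchyInRect:afterScreenUpdates: attempt described above looks roughly like this in Swift (a sketch, not working code for this case, since the video area comes out black as noted):

```swift
import UIKit

// Rough sketch of the view-hierarchy snapshot attempt: draw the whole
// hierarchy into an image context, hoping the AVPlayerLayer content is
// included. For streamed video the player area renders as black.
func snapshot(of view: UIView) -> UIImage? {
    UIGraphicsBeginImageContextWithOptions(view.bounds.size, false, 0)
    defer { UIGraphicsEndImageContext() }
    view.drawHierarchy(in: view.bounds, afterScreenUpdates: true)
    return UIGraphicsGetImageFromCurrentImageContext()
}
```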
AVPlayerItemVideoOutput works fine for me from an m3u8. Maybe it's because I don't consult hasNewPixelBufferForItemTime and simply call copyPixelBufferForItemTime? This code produces a CVPixelBuffer instead of a UIImage, but there are answers that describe how to do that.

This answer is mostly cribbed from here.
#import "ViewController.h"
#import <AVFoundation/AVFoundation.h>

@interface ViewController ()

@property (nonatomic) AVPlayer *player;
@property (nonatomic) AVPlayerItem *playerItem;
@property (nonatomic) AVPlayerItemVideoOutput *playerOutput;

@end

@implementation ViewController

- (void)setupPlayerWithLoadedAsset:(AVAsset *)asset {
    NSDictionary *settings = @{ (id)kCVPixelBufferPixelFormatTypeKey : @(kCVPixelFormatType_32BGRA) };
    self.playerOutput = [[AVPlayerItemVideoOutput alloc] initWithPixelBufferAttributes:settings];
    self.playerItem = [AVPlayerItem playerItemWithAsset:asset];
    [self.playerItem addOutput:self.playerOutput];
    self.player = [AVPlayer playerWithPlayerItem:self.playerItem];

    AVPlayerLayer *playerLayer = [AVPlayerLayer playerLayerWithPlayer:self.player];
    playerLayer.frame = self.view.frame;
    [self.view.layer addSublayer:playerLayer];

    [self.player play];
}

- (IBAction)grabFrame {
    CVPixelBufferRef buffer = [self.playerOutput copyPixelBufferForItemTime:[self.playerItem currentTime] itemTimeForDisplay:nil];
    NSLog(@"The image: %@", buffer);
    // copyPixelBufferForItemTime: follows the Create Rule, so the
    // buffer must be released once you are done with it.
    if (buffer) {
        CVBufferRelease(buffer);
    }
}

- (void)viewDidLoad {
    [super viewDidLoad];

    NSURL *someUrl = [NSURL URLWithString:@"http://qthttp.apple.com.edgesuite.net/1010qwoeiuryfg/sl.m3u8"];
    AVURLAsset *asset = [AVURLAsset URLAssetWithURL:someUrl options:nil];

    [asset loadValuesAsynchronouslyForKeys:@[@"tracks"] completionHandler:^{
        NSError *error = nil;
        AVKeyValueStatus status = [asset statusOfValueForKey:@"tracks" error:&error];
        if (status == AVKeyValueStatusLoaded) {
            dispatch_async(dispatch_get_main_queue(), ^{
                [self setupPlayerWithLoadedAsset:asset];
            });
        } else {
            NSLog(@"%@ Failed to load the tracks.", self);
        }
    }];
}

@end
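Since copyPixelBufferForItemTime:itemTimeForDisplay: hands back a CVPixelBuffer rather than a UIImage, the missing conversion step can be sketched like this in Swift, going through Core Image (the helper name image(from:) is my own, not part of any API):

```swift
import AVFoundation
import UIKit

// Hypothetical helper: wrap the CVPixelBuffer in a CIImage, render it
// to a CGImage with a CIContext, then wrap that in a UIImage.
func image(from pixelBuffer: CVPixelBuffer) -> UIImage? {
    let ciImage = CIImage(cvPixelBuffer: pixelBuffer)
    let context = CIContext()
    guard let cgImage = context.createCGImage(ciImage, from: ciImage.extent) else {
        return nil
    }
    return UIImage(cgImage: cgImage)
}
```

Creating the CIContext once and reusing it is cheaper if you grab frames repeatedly; contexts are expensive to construct.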
AVAssetImageGenerator is the best way to snapshot a video; this method returns a UIImage asynchronously:
import AVFoundation

// ...
var player: AVPlayer? = // ...

func screenshot(handler: @escaping ((UIImage) -> Void)) {
    guard let player = player,
          let asset = player.currentItem?.asset else {
        return
    }
    let imageGenerator = AVAssetImageGenerator(asset: asset)
    imageGenerator.appliesPreferredTrackTransform = true
    let times = [NSValue(time: player.currentTime())]
    imageGenerator.generateCGImagesAsynchronously(forTimes: times) { _, image, _, _, _ in
        if let image = image {
            handler(UIImage(cgImage: image))
        }
    }
}
(It's Swift 4.2)
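A hypothetical call site for the screenshot(handler:) function above (imageView is an assumed UIImageView outlet, not from the original answer):

```swift
// Pause playback, then grab the frame at the current time. The
// generator's completion handler may run off the main queue, so hop
// back before touching UIKit.
player?.pause()
screenshot { image in
    DispatchQueue.main.async {
        imageView.image = image  // assumed UIImageView
    }
}
```

Note that for precise pause-frame capture you may also want to set requestedTimeToleranceBefore and requestedTimeToleranceAfter on the generator to .zero, at the cost of slower generation.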