How to get first frame of a video as the view loads?
As soon as the view loads I want to display a paused video with its first frame showing. For this I want to get the first frame of the video as a UIImage to display until the video is played.

Is there any way in iOS to get a frame of a video file kept locally, via its URL?
UPDATE: Please consider both my solution and Stavash's as well, and see whichever suits you.
I found a way that works fine as well! Check out the Apple documentation on the AVAssetImageGenerator class and the AV Foundation Programming Guide. It has a method called copyCGImageAtTime:actualTime:error:
Use the AVAsset class of the AVFoundation framework. AVAsset has a class method + assetWithURL: that gives you an asset object from the URL of a file. This method is only available on iOS 5 or above; we can do a respondsToSelector check, and if it fails, use
+ (AVURLAsset *)URLAssetWithURL:(NSURL *)URL options:(NSDictionary *)options
AVURLAsset is a concrete subclass of AVAsset and can be used when the URL of the local video file is available.

Once the asset is created, it has properties like naturalSize and duration, and a lot of methods for working on media files.
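The asset-creation step with the iOS-version fallback could be sketched like this (a minimal sketch; videoURL is an assumed NSURL pointing at the local video file):

#import <AVFoundation/AVFoundation.h>

// videoURL is assumed to be a file NSURL for the locally stored video.
AVAsset *asset = nil;
if ([AVAsset respondsToSelector:@selector(assetWithURL:)]) {
    // iOS 5 and later
    asset = [AVAsset assetWithURL:videoURL];
} else {
    // Fallback for earlier iOS versions
    asset = [AVURLAsset URLAssetWithURL:videoURL options:nil];
}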
Use + assetImageGeneratorWithAsset: of AVAssetImageGenerator, pass the asset to it, and create an image generator.
Call the method above to get a CGImageRef, then use UIImage's imageWithCGImage: to get the image and set it to a UIImageView's image property!
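Putting these steps together, a minimal sketch of the whole flow (asset is an AVAsset created from the local file's URL; self.imageView is an assumed UIImageView outlet):

// asset is assumed to be an AVAsset created from the local video's URL.
AVAssetImageGenerator *generator =
    [AVAssetImageGenerator assetImageGeneratorWithAsset:asset];
generator.appliesPreferredTrackTransform = YES; // honour the video's orientation

NSError *error = nil;
CMTime actualTime;
// Ask for the frame at time zero (requires the Core Media framework for CMTime)
CGImageRef cgImage = [generator copyCGImageAtTime:kCMTimeZero
                                       actualTime:&actualTime
                                            error:&error];
if (cgImage) {
    self.imageView.image = [UIImage imageWithCGImage:cgImage];
    CGImageRelease(cgImage); // copyCGImageAtTime: follows the Create/Copy rule
} else {
    NSLog(@"Could not generate first frame: %@", error);
}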
Plain and simple!
NOTE: Don't forget to add the Core Media framework for the CMTime you will pass to get the corresponding image frame. Also add the AVFoundation framework for all the AVFoundation classes.
I found the following code, it might help you:
-(void)writeVideoFrameAtTime:(CMTime)time {
    if (![videoWriterInput isReadyForMoreMediaData]) {
        NSLog(@"Not ready for video data");
        return;
    }

    @synchronized (self) {
        UIImage *newFrame = [self.currentScreen retain];
        CGImageRef cgImage = CGImageCreateCopy([newFrame CGImage]);
        CFDataRef image = CGDataProviderCopyData(CGImageGetDataProvider(cgImage));

        CVPixelBufferRef pixelBuffer = NULL;
        CVReturn status = CVPixelBufferPoolCreatePixelBuffer(kCFAllocatorDefault,
                                                             avAdaptor.pixelBufferPool,
                                                             &pixelBuffer);
        if (status != kCVReturnSuccess) {
            // Could not get a buffer from the pool
            NSLog(@"Error creating pixel buffer: status=%d", status);
        } else {
            // Copy the image data into the pixel buffer.
            // NOTE: this only works if the pixel buffer is contiguous and has
            // the same bytesPerRow as the input data.
            CVPixelBufferLockBaseAddress(pixelBuffer, 0);
            uint8_t *destPixels = CVPixelBufferGetBaseAddress(pixelBuffer);
            CFDataGetBytes(image, CFRangeMake(0, CFDataGetLength(image)), destPixels);

            BOOL success = [avAdaptor appendPixelBuffer:pixelBuffer
                               withPresentationTime:time];
            if (!success) {
                NSLog(@"Warning: Unable to write buffer to video");
            }

            CVPixelBufferUnlockBaseAddress(pixelBuffer, 0);
            CVPixelBufferRelease(pixelBuffer);
        }

        // Clean up
        [newFrame release];
        CFRelease(image);
        CGImageRelease(cgImage);
    }
}
Taken from http://codethink.no-ip.org/wordpress/archives/673#comment-8146