
swift: How to take screenshot of AVPlayerLayer()

How do I take a screenshot of an AVPlayerLayer? I tried the following code; it works well and captures the entire view as it appears on screen:

func screenShotMethod() {
    let window = UIApplication.shared.delegate!.window!!
    //capture the entire window into an image
    UIGraphicsBeginImageContextWithOptions(window.bounds.size, false, UIScreen.main.scale)
    window.drawHierarchy(in: window.bounds, afterScreenUpdates: false)
    let windowImage = UIGraphicsGetImageFromCurrentImageContext()
    UIGraphicsEndImageContext()
    //now position the image x/y away from the top-left corner to get the portion we want
    UIGraphicsBeginImageContext(view.frame.size)
    windowImage?.draw(at: CGPoint(x: -view.frame.origin.x, y: -view.frame.origin.y))
    let croppedImage: UIImage = UIGraphicsGetImageFromCurrentImageContext()!
    UIGraphicsEndImageContext();
    //embed image in an imageView, supports transforms.
    let resultImageView = UIImageView(image: croppedImage)
    UIImageWriteToSavedPhotosAlbum(croppedImage, nil, nil, nil)
}

But the problem is that when I run the same code on an iPhone (device), it returns a black image. I don't know what is wrong.

Any suggestions would be greatly helpful!

Here is the code that is working for me in Swift 4:

var videoImage = UIImage()

if let url = (player.currentItem?.asset as? AVURLAsset)?.url {

      let asset = AVAsset(url: url)

      let imageGenerator = AVAssetImageGenerator(asset: asset)
      imageGenerator.requestedTimeToleranceAfter = CMTime.zero
      imageGenerator.requestedTimeToleranceBefore = CMTime.zero

      if let thumb: CGImage = try? imageGenerator.copyCGImage(at: player.currentTime(), actualTime: nil) {
            //print("video img successful")
            videoImage = UIImage(cgImage: thumb)
       }

}
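
If you prefer to surface frame-grab failures instead of silently getting no image, the same idea can be wrapped in a small helper. This is a minimal sketch, not part of the original answer: the helper name currentFrame(of:) is mine, and it assumes player is the AVPlayer you are snapshotting.

import AVFoundation
import UIKit

// Minimal sketch: grab the current frame and surface any error instead of
// swallowing it with try?. Assumes `player` is the AVPlayer being captured.
func currentFrame(of player: AVPlayer) -> UIImage? {
    guard let asset = player.currentItem?.asset else { return nil }

    let imageGenerator = AVAssetImageGenerator(asset: asset)
    // Honour the track's preferred transform so rotated videos come out upright.
    imageGenerator.appliesPreferredTrackTransform = true
    imageGenerator.requestedTimeToleranceBefore = CMTime.zero
    imageGenerator.requestedTimeToleranceAfter = CMTime.zero

    do {
        let cgImage = try imageGenerator.copyCGImage(at: player.currentTime(), actualTime: nil)
        return UIImage(cgImage: cgImage)
    } catch {
        // Streaming (non-file) assets, for example, can fail here.
        print("Frame grab failed: \(error)")
        return nil
    }
}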

A few days ago, we ran into the same issue: when we take a screenshot of a screen that has a video player in it, the screenshot looks fine in the simulator, but on the device it is a black screen.

After a lot of attempts I failed, and finally ended up with a workaround (not sure if it is the correct way of solving the problem). But it did the trick, and I was able to get the screenshot on the device as well as the simulator.

Here is the approach I used to solve the issue:

1 -> Get a single frame from the video at the current time (a public API is already available for this).

2 -> Use this thumbnail in place of the AVPlayerLayer (add it to the view hierarchy).

3 -> Once done, remove the thumbnail (remove it from the view hierarchy).

Below is a demo sample (the given solution is in Objective-C, though the question was asked in Swift); a Swift version follows after it.

Objective-C solution

  // RadiansToDegrees is used below; assumed helper macro (not shown in the original post):
  #define RadiansToDegrees(radians) ((radians) * 180.0 / M_PI)

  - (void)SnapShot {
       UIImage *capturedImage = [self getASnapShotWithAVLayer];
    }
    - (UIImage *)getASnapShotWithAVLayer {
        //Add temporary thumbnail One
        UIImageView *temporaryViewForVideoOne = [[UIImageView alloc] initWithFrame:self.videoViewOne.bounds];
        temporaryViewForVideoOne.contentMode = UIViewContentModeScaleAspectFill;
        UIImage *imageFromCurrentTimeForVideoOne = [self takeVideoSnapShot:_playerItem1];
        int orientationFromVideoForVideoOne = [self getTheActualOrientationOfVideo:self.playerItem1];
        if(orientationFromVideoForVideoOne == 0)
        {
            orientationFromVideoForVideoOne = 3;
        }
        else if (orientationFromVideoForVideoOne == 90)
        {
            orientationFromVideoForVideoOne = 0;
        }
        imageFromCurrentTimeForVideoOne =
        [UIImage imageWithCGImage:[imageFromCurrentTimeForVideoOne CGImage]
                            scale:[imageFromCurrentTimeForVideoOne scale]
                      orientation: orientationFromVideoForVideoOne];
        UIImage *rotatedImageFromCurrentContextForVideoOne = [self normalizedImage:imageFromCurrentTimeForVideoOne];
        temporaryViewForVideoOne.clipsToBounds = YES;
        temporaryViewForVideoOne.image = rotatedImageFromCurrentContextForVideoOne;
        [self.videoViewOne addSubview:temporaryViewForVideoOne];
        CGSize imageSize = CGSizeZero;
        UIInterfaceOrientation orientation = [[UIApplication sharedApplication] statusBarOrientation];
        if (UIInterfaceOrientationIsPortrait(orientation)) {
            imageSize = [UIScreen mainScreen].bounds.size;
        } else {
            imageSize = CGSizeMake([UIScreen mainScreen].bounds.size.height, [UIScreen mainScreen].bounds.size.width);
        }

        UIGraphicsBeginImageContextWithOptions(imageSize, NO, [[UIScreen mainScreen] scale]);
        CGContextRef context = UIGraphicsGetCurrentContext();
        for (UIWindow *window in [[UIApplication sharedApplication] windows]) {
            CGContextSaveGState(context);
            CGContextTranslateCTM(context, window.center.x, window.center.y);
            CGContextConcatCTM(context, window.transform);
            CGContextTranslateCTM(context, -window.bounds.size.width * window.layer.anchorPoint.x, -window.bounds.size.height * window.layer.anchorPoint.y);
            if (orientation == UIInterfaceOrientationLandscapeLeft) {
                CGContextRotateCTM(context, M_PI_2);
                CGContextTranslateCTM(context, 0, -imageSize.width);
            } else if (orientation == UIInterfaceOrientationLandscapeRight) {
                CGContextRotateCTM(context, -M_PI_2);
                CGContextTranslateCTM(context, -imageSize.height, 0);
            } else if (orientation == UIInterfaceOrientationPortraitUpsideDown) {
                CGContextRotateCTM(context, M_PI);
                CGContextTranslateCTM(context, -imageSize.width, -imageSize.height);
            }
            if ([window respondsToSelector:@selector(drawViewHierarchyInRect:afterScreenUpdates:)]) {
                [window drawViewHierarchyInRect:window.bounds afterScreenUpdates:YES];
            } else {
                // Fallback for older iOS versions without drawViewHierarchyInRect:afterScreenUpdates:
                [window.layer renderInContext:context];
            }
            CGContextRestoreGState(context);
        }
        UIImage *image = UIGraphicsGetImageFromCurrentImageContext();
        UIGraphicsEndImageContext();
        [temporaryViewForVideoOne removeFromSuperview];
        // (If you have a second video view, remove its temporary thumbnail here as well.)
        return image;
    }
    -(UIImage *)takeVideoSnapShot: (AVPlayerItem *) playerItem{
        AVURLAsset *asset = (AVURLAsset *) playerItem.asset;
        AVAssetImageGenerator *imageGenerator = [[AVAssetImageGenerator alloc] initWithAsset:asset];
        imageGenerator.requestedTimeToleranceAfter = kCMTimeZero;
        imageGenerator.requestedTimeToleranceBefore = kCMTimeZero;
        CGImageRef thumb = [imageGenerator copyCGImageAtTime:playerItem.currentTime
                                                  actualTime:NULL
                                                       error:NULL];
        UIImage *videoImage = [UIImage imageWithCGImage:thumb];
        CGImageRelease(thumb);
        return videoImage;
    }
    -(int)getTheActualOrientationOfVideo:(AVPlayerItem *)playerItem
    {
        AVAsset *asset = playerItem.asset;
        NSArray *tracks = [asset tracksWithMediaType:AVMediaTypeVideo];
        AVAssetTrack *track = [tracks objectAtIndex:0];
        CGAffineTransform videoAssetOrientation_ = [track preferredTransform];
        CGFloat videoAngle  = RadiansToDegrees(atan2(videoAssetOrientation_.b, videoAssetOrientation_.a));
        int  orientation = 0;
        switch ((int)videoAngle) {
            case 0:
                orientation = UIImageOrientationRight;
                break;
            case 90:
                orientation = UIImageOrientationUp;
                break;
            case 180:
                orientation = UIImageOrientationLeft;
                break;
            case -90:
                orientation = UIImageOrientationDown;
                break;
            default:
                //Not found
                break;
        }
        return orientation;
    }
    - (UIImage *)normalizedImage:(UIImage *)imageOf {
        if (imageOf.imageOrientation == UIImageOrientationUp) return imageOf;

        UIGraphicsBeginImageContextWithOptions(imageOf.size, NO, imageOf.scale);
        [imageOf drawInRect:(CGRect){0, 0, imageOf.size}];
        UIImage *normalizedImage = UIGraphicsGetImageFromCurrentImageContext();
        UIGraphicsEndImageContext();
        return normalizedImage;
    }

Swift solution

func snapShot() {
    let capturedImage: UIImage? = getASnapShotWithAVLayer()
}

func getASnapShotWithAVLayer() -> UIImage {
    //Add temporary thumbnail One
    let temporaryViewForVideoOne = UIImageView(frame: videoViewOne.bounds) //replace videoViewOne with you view which is showing AVPlayerContent
    temporaryViewForVideoOne.contentMode = .scaleAspectFill
    var imageFromCurrentTimeForVideoOne: UIImage = takeVideoSnapShot(playerItem1)
    var orientationFromVideoForVideoOne: Int = getTheActualOrientationOfVideo(playerItem1)
    if orientationFromVideoForVideoOne == 0 {
        orientationFromVideoForVideoOne = 3
    }
    else if orientationFromVideoForVideoOne == 90 {
        orientationFromVideoForVideoOne = 0
    }

    if let cgImage = imageFromCurrentTimeForVideoOne.cgImage {
        imageFromCurrentTimeForVideoOne = UIImage(cgImage: cgImage,
                                                  scale: imageFromCurrentTimeForVideoOne.scale,
                                                  orientation: UIImage.Orientation(rawValue: orientationFromVideoForVideoOne) ?? .up)
    }
    let rotatedImageFromCurrentContextForVideoOne: UIImage? = normalizedImage(imageFromCurrentTimeForVideoOne)
    temporaryViewForVideoOne.clipsToBounds = true
    temporaryViewForVideoOne.image = rotatedImageFromCurrentContextForVideoOne
    videoViewOne.addSubview(temporaryViewForVideoOne) //Replace videoViewOne with your view containing AVPlayer
    var imageSize = CGSize.zero
    let orientation: UIInterfaceOrientation = UIApplication.shared.statusBarOrientation
    if orientation.isPortrait {
        imageSize = UIScreen.main.bounds.size
    }
    else {
        imageSize = CGSize(width: CGFloat(UIScreen.main.bounds.size.height), height: CGFloat(UIScreen.main.bounds.size.width))
    }
    UIGraphicsBeginImageContextWithOptions(imageSize, false, UIScreen.main.scale)
    guard let context = UIGraphicsGetCurrentContext() else {
        UIGraphicsEndImageContext()
        return UIImage()
    }
    for window: UIWindow in UIApplication.shared.windows {
        context.saveGState()
        context.translateBy(x: window.center.x, y: window.center.y)
        context.concatenate(window.transform)
        context.translateBy(x: -window.bounds.size.width * window.layer.anchorPoint.x, y: -window.bounds.size.height * window.layer.anchorPoint.y)
        if orientation == .landscapeLeft {
            context.rotate(by: CGFloat.pi / 2)
            context.translateBy(x: 0, y: -imageSize.width)
        }
        else if orientation == .landscapeRight {
            context.rotate(by: -CGFloat.pi / 2)
            context.translateBy(x: -imageSize.height, y: 0)
        }
        else if orientation == .portraitUpsideDown {
            context.rotate(by: .pi)
            context.translateBy(x: -imageSize.width, y: -imageSize.height)
        }

        // drawHierarchy(in:afterScreenUpdates:) exists on every iOS version that supports Swift,
        // so no responds(to:) check is needed here.
        window.drawHierarchy(in: window.bounds, afterScreenUpdates: true)
        context.restoreGState()
    }
    let image: UIImage? = UIGraphicsGetImageFromCurrentImageContext()
    UIGraphicsEndImageContext()
    temporaryViewForVideoOne.removeFromSuperview()
    return image!
}

func takeVideoSnapShot(_ playerItem: AVPlayerItem) -> UIImage {
    let imageGenerator = AVAssetImageGenerator(asset: playerItem.asset)
    imageGenerator.requestedTimeToleranceAfter = CMTime.zero
    imageGenerator.requestedTimeToleranceBefore = CMTime.zero
    guard let thumb = try? imageGenerator.copyCGImage(at: playerItem.currentTime(), actualTime: nil) else {
        return UIImage()
    }
    // CGImage memory is managed automatically in Swift; no CGImageRelease is needed.
    return UIImage(cgImage: thumb)
}

func getTheActualOrientationOfVideo(_ playerItem: AVPlayerItem) -> Int {
    guard let track = playerItem.asset.tracks(withMediaType: .video).first else { return 0 }
    let transform = track.preferredTransform
    let videoAngle = atan2(Double(transform.b), Double(transform.a)) * 180 / Double.pi
    var orientation = 0
    switch Int(videoAngle) {
        case 0:
            orientation = UIImage.Orientation.right.rawValue
        case 90:
            orientation = UIImage.Orientation.up.rawValue
        case 180:
            orientation = UIImage.Orientation.left.rawValue
        case -90:
            orientation = UIImage.Orientation.down.rawValue
        default:
            //Not found
            break
    }
    return orientation
}

func normalizedImage(_ imageOf: UIImage) -> UIImage {
    if imageOf.imageOrientation == .up {
        return imageOf
    }
    UIGraphicsBeginImageContextWithOptions(imageOf.size, false, imageOf.scale)
    imageOf.draw(in: CGRect(origin: .zero, size: imageOf.size))
    let normalizedImage: UIImage? = UIGraphicsGetImageFromCurrentImageContext()
    UIGraphicsEndImageContext()
    return normalizedImage!
}

Here is code for taking a screenshot of your AVPlayer, including any UI behind it that you also want in the screenshot.

func takeScreenshot() -> UIImage? {
    //1 Hide all UI you do not want on the screenshot
    self.hideButtonsForScreenshot()

    //2 Create a screenshot from your AVPlayer
    if let url = (self.overlayPlayer?.currentItem?.asset as? AVURLAsset)?.url {

          let asset = AVAsset(url: url)

          let imageGenerator = AVAssetImageGenerator(asset: asset)
          imageGenerator.requestedTimeToleranceAfter = CMTime.zero
          imageGenerator.requestedTimeToleranceBefore = CMTime.zero

        if let thumb: CGImage = try? imageGenerator.copyCGImage(at: self.overlayPlayer!.currentTime(), actualTime: nil) {
            let videoImage = UIImage(cgImage: thumb)
            //Note: create an image view on top of your video player with the exact same dimensions, and display it before taking the screenshot
            // mine is created in the storyboard
            // 3 Put the image from the screenshot in your screenshotPhotoView and unhide it
            self.screenshotPhotoView.image = videoImage
            self.screenshotPhotoView.isHidden = false
        }
    }
    
    //4 Take the screenshot
    let bounds = UIScreen.main.bounds
    UIGraphicsBeginImageContextWithOptions(bounds.size, true, 0.0)
    self.view.drawHierarchy(in: bounds, afterScreenUpdates: true)
    let image = UIGraphicsGetImageFromCurrentImageContext()
    UIGraphicsEndImageContext()
    
    //5 show all UI again that you didn't want on your screenshot
    self.showButtonsForScreenshot()
    //6 Now hide the screenshotPhotoView again
    self.screenshotPhotoView.isHidden = true
    self.screenshotPhotoView.image = nil
    return image
}
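
For completeness, here is a small usage sketch, which is an assumption rather than part of the original answer: saveButtonTapped is a hypothetical button action, and saving to the photo library requires the NSPhotoLibraryAddUsageDescription key in Info.plist.

// Hypothetical usage: save the combined screenshot to the photo library.
@objc func saveButtonTapped() {
    guard let screenshot = takeScreenshot() else { return }
    // Requires NSPhotoLibraryAddUsageDescription in Info.plist.
    UIImageWriteToSavedPhotosAlbum(screenshot, nil, nil, nil)
}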
