
How to make a movie from a set of images using UIGetScreenImage

I have used this approach and captured a number of images. I can create the movie successfully, but my problem is that when I play it back it seems to run too fast, i.e. the movie does not contain all the frames. Here is my code.

-(UIImage *)uiImageScreen
{
    CGImageRef screen = UIGetScreenImage();
    UIImage *image = [UIImage imageWithCGImage:screen];
    CGImageRelease(screen);
    UIImageWriteToSavedPhotosAlbum(image, self, nil, nil);
    return image;
}

-(void)writeSample:(NSTimer *)_timer
{
    if (assetWriterInput.readyForMoreMediaData) {
        CVReturn cvErr = kCVReturnSuccess;

        // get screenshot image!
        CGImageRef image = (CGImageRef)[[self uiImageScreen] CGImage];
        NSLog(@"made screenshot");

        // prepare the pixel buffer
        CVPixelBufferRef pixelBuffer = NULL;
        CFDataRef imageData = CGDataProviderCopyData(CGImageGetDataProvider(image));
        NSLog(@"copied image data");
        cvErr = CVPixelBufferCreateWithBytes(kCFAllocatorDefault,
                                             FRAME_WIDTH,
                                             FRAME_HEIGHT,
                                             kCVPixelFormatType_32BGRA,
                                             (void *)CFDataGetBytePtr(imageData),
                                             CGImageGetBytesPerRow(image),
                                             NULL,
                                             NULL,
                                             NULL,
                                             &pixelBuffer);
        NSLog(@"CVPixelBufferCreateWithBytes returned %d", cvErr);

        // calculate the time
        CFAbsoluteTime thisFrameWallClockTime = CFAbsoluteTimeGetCurrent();
        CFTimeInterval elapsedTime = thisFrameWallClockTime - firstFrameWallClockTime;
        NSLog(@"elapsedTime: %f", elapsedTime);
        CMTime presentationTime = CMTimeMake(elapsedTime * 600, 600);

        // write the sample
        BOOL appended = [assetWriterPixelBufferAdaptor appendPixelBuffer:pixelBuffer withPresentationTime:presentationTime];

        if (appended) {
            NSLog(@"appended sample at time %lf", CMTimeGetSeconds(presentationTime));
        } else {
            NSLog(@"failed to append");
        }
    }
}
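One thing worth noting about writeSample: above: imageData (the CFDataRef returned by CGDataProviderCopyData) is never released, and neither is pixelBuffer, so every frame leaks a screen's worth of bytes. Because CVPixelBufferCreateWithBytes only wraps the bytes rather than copying them, the usual way to hand ownership over is a release callback. This is only a sketch, reusing the same variables as the snippet above; the releaseCFData helper name is hypothetical:

// Hypothetical callback: Core Video calls this when the pixel buffer is
// destroyed, so the backing CFData can be released at the right moment.
static void releaseCFData(void *releaseRefCon, const void *baseAddress)
{
    // releaseRefCon is the CFDataRef handed to CVPixelBufferCreateWithBytes
    CFRelease((CFDataRef)releaseRefCon);
}

// ...then in writeSample:, pass the callback and the data instead of NULL:
cvErr = CVPixelBufferCreateWithBytes(kCFAllocatorDefault,
                                     FRAME_WIDTH,
                                     FRAME_HEIGHT,
                                     kCVPixelFormatType_32BGRA,
                                     (void *)CFDataGetBytePtr(imageData),
                                     CGImageGetBytesPerRow(image),
                                     releaseCFData,      // invoked when the buffer is freed
                                     (void *)imageData,  // handed back to the callback
                                     NULL,
                                     &pixelBuffer);

With the callback in place, calling CVPixelBufferRelease(pixelBuffer) after the append is enough; the data is freed once Core Video is done with it.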

Then I call this method to create the movie.

-(void)StartRecording
{
    NSString *moviePath = [[self pathToDocumentsDirectory] stringByAppendingPathComponent:OUTPUT_FILE_NAME];
    if ([[NSFileManager defaultManager] fileExistsAtPath:moviePath]) {
        [[NSFileManager defaultManager] removeItemAtPath:moviePath error:nil];
    }

    NSURL *movieURL = [NSURL fileURLWithPath:moviePath];
    NSLog(@"path=%@", movieURL);
    NSError *movieError = nil;
    [assetWriter release];
    assetWriter = [[AVAssetWriter alloc] initWithURL:movieURL
                                            fileType:AVFileTypeQuickTimeMovie
                                               error:&movieError];
    NSDictionary *assetWriterInputSettings = [NSDictionary dictionaryWithObjectsAndKeys:
                                              AVVideoCodecH264, AVVideoCodecKey,
                                              [NSNumber numberWithInt:320], AVVideoWidthKey,
                                              [NSNumber numberWithInt:480], AVVideoHeightKey,
                                              nil];
    assetWriterInput = [AVAssetWriterInput assetWriterInputWithMediaType:AVMediaTypeVideo
                                                          outputSettings:assetWriterInputSettings];
    assetWriterInput.expectsMediaDataInRealTime = YES;
    [assetWriter addInput:assetWriterInput];

    [assetWriterPixelBufferAdaptor release];
    assetWriterPixelBufferAdaptor = [[AVAssetWriterInputPixelBufferAdaptor alloc]
                                     initWithAssetWriterInput:assetWriterInput
                                     sourcePixelBufferAttributes:nil];
    [assetWriter startWriting];

    firstFrameWallClockTime = CFAbsoluteTimeGetCurrent();
    [assetWriter startSessionAtSourceTime:CMTimeMake(0, 1000)];

    // start writing samples to it
    [assetWriterTimer release];
    assetWriterTimer = [NSTimer scheduledTimerWithTimeInterval:0.1
                                                        target:self
                                                      selector:@selector(writeSample:)
                                                      userInfo:nil
                                                       repeats:YES];
}
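The posted code shows how the writer is started but not how it is finished; if the AVAssetWriter is never told to finish, the resulting movie file can come out truncated. A minimal sketch of a matching stop method, assuming the same ivars as above (the name StopRecording is hypothetical, mirroring StartRecording):

-(void)StopRecording
{
    // stop producing frames
    [assetWriterTimer invalidate];
    assetWriterTimer = nil;

    // tell the input no more samples are coming, then close the file
    [assetWriterInput markAsFinished];
    if ([assetWriter finishWriting]) {
        NSLog(@"finished writing movie");
    } else {
        NSLog(@"finishWriting failed: %@", assetWriter.error);
    }
}

On iOS versions where it is available, finishWritingWithCompletionHandler: is the non-blocking replacement for the synchronous finishWriting.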

Try this approach...

if (![videoWriterInput isReadyForMoreMediaData]) {
    NSLog(@"Not ready for video data");
}
else {
    @synchronized (self) {
        UIImage *newFrame = [self.currentScreen retain];
        CVPixelBufferRef pixelBuffer = NULL;
        CGImageRef cgImage = CGImageCreateCopy([newFrame CGImage]);
        CFDataRef image = CGDataProviderCopyData(CGImageGetDataProvider(cgImage));

        int status = CVPixelBufferPoolCreatePixelBuffer(kCFAllocatorDefault, avAdaptor.pixelBufferPool, &pixelBuffer);
        if (status != 0) {
            // could not get a buffer from the pool
            NSLog(@"Error creating pixel buffer:  status=%d", status);
        }
        else {
            // set image data into pixel buffer
            CVPixelBufferLockBaseAddress(pixelBuffer, 0);
            uint8_t *destPixels = CVPixelBufferGetBaseAddress(pixelBuffer);
            CFDataGetBytes(image, CFRangeMake(0, CFDataGetLength(image)), destPixels);  // XXX: will work if the pixel buffer is contiguous and has the same bytesPerRow as the input data

            // `time` is the presentation timestamp computed by the caller
            BOOL success = [avAdaptor appendPixelBuffer:pixelBuffer withPresentationTime:time];
            if (!success) {
                NSLog(@"Warning:  Unable to write buffer to video");
            }

            CVPixelBufferUnlockBaseAddress(pixelBuffer, 0);
            CVPixelBufferRelease(pixelBuffer);
        }

        // clean up
        [newFrame release];
        CFRelease(image);
        CGImageRelease(cgImage);
    }
}
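The XXX comment in the snippet above is the main caveat: CVPixelBufferGetBytesPerRow is frequently padded and will not match CGImageGetBytesPerRow, in which case the single CFDataGetBytes call shears or truncates the image. A row-by-row copy sidesteps that. This is only a sketch, assuming the CGImage and the pixel buffer share the same height and a 32-bit-per-pixel layout, and reusing cgImage, image, and pixelBuffer from the code above:

// Row-by-row copy that tolerates differing bytes-per-row values
CVPixelBufferLockBaseAddress(pixelBuffer, 0);

uint8_t *destBase        = CVPixelBufferGetBaseAddress(pixelBuffer);
size_t destBytesPerRow   = CVPixelBufferGetBytesPerRow(pixelBuffer);
const uint8_t *srcBase   = CFDataGetBytePtr(image);
size_t srcBytesPerRow    = CGImageGetBytesPerRow(cgImage);
size_t height            = CGImageGetHeight(cgImage);
size_t rowLength         = MIN(srcBytesPerRow, destBytesPerRow);

for (size_t row = 0; row < height; row++) {
    // copy only the payload portion of each row, skipping any padding
    memcpy(destBase + row * destBytesPerRow,
           srcBase  + row * srcBytesPerRow,
           rowLength);
}

CVPixelBufferUnlockBaseAddress(pixelBuffer, 0);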
