
How to merge live streaming video and audio in the iPhone SDK

I am developing a video-based application in iOS. In my application I need to merge video and audio. I have successfully merged a local video file with an audio file, but I am not able to merge live streaming video with audio: when I try, the app crashes because of the stream's time duration. For the merging I am using the code below.
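The crash described above is typically because a live/HLS asset reports an indefinite `duration` and exposes no tracks, while the code below reads both synchronously. As a minimal sketch (not from the original post; `url` stands for the stream URL used below), the asset's keys can be loaded asynchronously and checked before any composition work:

    // Sketch: load "duration" and "tracks" before using them. A live/HLS
    // stream usually reports an indefinite duration and no extractable
    // tracks, which is why reading them directly can crash the merge code.
    AVURLAsset *asset = [[AVURLAsset alloc] initWithURL:url options:nil];
    [asset loadValuesAsynchronouslyForKeys:@[@"duration", @"tracks"]
                         completionHandler:^{
        NSError *error = nil;
        AVKeyValueStatus status = [asset statusOfValueForKey:@"duration"
                                                       error:&error];
        if (status != AVKeyValueStatusLoaded ||
            CMTIME_IS_INDEFINITE(asset.duration)) {
            NSLog(@"No fixed duration; this asset cannot be merged this way");
            return;
        }
        // Safe to build the AVMutableComposition here.
    }];

This only detects the problem; a stream with an indefinite duration still cannot be put through an `AVAssetExportSession`-based merge at all.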

    - (void)playerFunction
    {
        NSURL *url = [NSURL URLWithString:@"http://www.digdang.com/media/VideoFolde/017141.mp4"];
    //  NSURL *url = [NSURL URLWithString:@"http://www.educator.com:1935/mobile/mp4:testVideo.mp4/playlist.m3u8"]; // this one is totally not working

        NSString *audioInputFileName = @"audio.mp3";
        NSString *audioInputFilePath = [[[NSBundle mainBundle] resourcePath] stringByAppendingPathComponent:audioInputFileName];
        NSURL *audioURL = [NSURL fileURLWithPath:audioInputFilePath];

        NSString *videoName = @"output.mov"; // output file
        NSString *savePath = [NSTemporaryDirectory() stringByAppendingPathComponent:videoName];

        NSFileManager *fileManager = [NSFileManager defaultManager];
        if ([fileManager fileExistsAtPath:savePath]) {
            [fileManager removeItemAtPath:savePath error:nil];
        } else {
            NSLog(@"File not found");
        }

        AVMutableComposition *mixComposition = [AVMutableComposition composition];
        NSDictionary *options = @{ AVURLAssetPreferPreciseDurationAndTimingKey : @YES };
        AVURLAsset *videoAsset = [[AVURLAsset alloc] initWithURL:url options:options];
        AVURLAsset *audioAsset = [[AVURLAsset alloc] initWithURL:audioURL options:options];

        // A live/HLS stream has an indefinite duration and may expose no
        // tracks; guard here instead of crashing on objectAtIndex:0.
        AVAssetTrack *videoTrack = [[videoAsset tracksWithMediaType:AVMediaTypeVideo] firstObject];
        AVAssetTrack *audioTrack = [[audioAsset tracksWithMediaType:AVMediaTypeAudio] firstObject];
        if (videoTrack == nil || audioTrack == nil || CMTIME_IS_INDEFINITE(videoAsset.duration)) {
            NSLog(@"Missing tracks or indefinite duration; cannot merge");
            return;
        }
        NSLog(@"Video duration: %f", CMTimeGetSeconds(videoAsset.duration));

        NSError *error = nil;
        CMTimeRange videoTimeRange = CMTimeRangeMake(kCMTimeZero, videoAsset.duration);
        AVMutableCompositionTrack *compositionVideoTrack =
            [mixComposition addMutableTrackWithMediaType:AVMediaTypeVideo
                                        preferredTrackID:kCMPersistentTrackID_Invalid];
        [compositionVideoTrack insertTimeRange:videoTimeRange ofTrack:videoTrack atTime:kCMTimeZero error:&error];
        if (error) {
            NSLog(@"Could not insert video track: %@", error.localizedDescription);
            return;
        }
        // Stretch/compress the video so it matches the audio's duration.
        [compositionVideoTrack scaleTimeRange:videoTimeRange toDuration:audioAsset.duration];

        CMTimeRange audioTimeRange = CMTimeRangeMake(kCMTimeZero, audioAsset.duration);
        AVMutableCompositionTrack *compositionAudioTrack =
            [mixComposition addMutableTrackWithMediaType:AVMediaTypeAudio
                                        preferredTrackID:kCMPersistentTrackID_Invalid];
        [compositionAudioTrack insertTimeRange:audioTimeRange ofTrack:audioTrack atTime:kCMTimeZero error:&error];
        if (error) {
            NSLog(@"Could not insert audio track: %@", error.localizedDescription);
            return;
        }

        AVAssetExportSession *assetExport =
            [[AVAssetExportSession alloc] initWithAsset:mixComposition
                                             presetName:AVAssetExportPresetLowQuality];
        assetExport.shouldOptimizeForNetworkUse = YES;
        assetExport.outputFileType = AVFileTypeQuickTimeMovie; // instead of the raw UTI string
        assetExport.outputURL = [NSURL fileURLWithPath:savePath];
        assetExport.timeRange = CMTimeRangeMake(kCMTimeZero, audioAsset.duration);

        [assetExport exportAsynchronouslyWithCompletionHandler:^{
            switch (assetExport.status) {
                case AVAssetExportSessionStatusCompleted:
                    NSLog(@"Export Complete");
                    // ------>>> From here I want to play the movie using MPMoviePlayerController. <<<---------
                    [self play];
                    break;
                case AVAssetExportSessionStatusFailed:
                    NSLog(@"Export Failed");
                    NSLog(@"ExportSessionError: %@", assetExport.error.localizedDescription);
                    break;
                case AVAssetExportSessionStatusCancelled:
                    NSLog(@"Export Cancelled");
                    NSLog(@"ExportSessionError: %@", assetExport.error.localizedDescription);
                    break;
                default:
                    break;
            }
        }];

        NSLog(@"savePath: %@", savePath);
    }

Please, can somebody help me?

You are not passing the right live video URL to play.

In your code you have passed a local video URL inside the documents directory (`NSString *fileNamePath1 = @"Egg_break.mov";`). Instead of this you should pass a URL string from the server.

For example:

    AVURLAsset *asset = [AVURLAsset URLAssetWithURL:[NSURL URLWithString:@"http://qtdevseed.apple.com/addemo/ad.m3u8"] options:nil];
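Note that an `.m3u8` playlist like this is an HLS stream: it can be played, but it cannot be pushed through `AVAssetExportSession`, so the merge in the question will not work on it. A minimal playback sketch (my addition, using the same example URL), assuming you only need to play the stream rather than export it:

    // HLS streams are for playback, not export; hand the playlist to
    // AVPlayer directly instead of building a composition from it.
    NSURL *streamURL = [NSURL URLWithString:@"http://qtdevseed.apple.com/addemo/ad.m3u8"];
    AVPlayer *player = [AVPlayer playerWithURL:streamURL];
    [player play];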
