How to merge live streaming video and audio in the iPhone SDK
I am developing a video-based application in iOS. In my application I need to merge video and audio. I have already merged local video and audio files, but when I try the same with live streaming video and audio, the merge fails — the app crashes because of the duration. For merging I use the following code:
- (void)playerFunction
{
    NSURL *url = [NSURL URLWithString:@"http://www.digdang.com/media/VideoFolde/017141.mp4"];
    // NSURL *url = [NSURL URLWithString:@"http://www.educator.com:1935/mobile/mp4:testVideo.mp4/playlist.m3u8"]; // this one does not work at all

    NSString *audio_inputFileName = @"audio.mp3";
    NSString *audio_inputFilePath = [[[NSBundle mainBundle] resourcePath] stringByAppendingPathComponent:audio_inputFileName];
    NSURL *audiopath = [NSURL fileURLWithPath:audio_inputFilePath];

    // Remove any previous output file.
    NSString *videoName = @"output.mov";
    NSString *savepath = [NSTemporaryDirectory() stringByAppendingPathComponent:videoName];
    NSFileManager *filemgr = [NSFileManager defaultManager];
    if ([filemgr fileExistsAtPath:savepath]) {
        [filemgr removeItemAtPath:savepath error:nil];
    } else {
        NSLog(@"File not found");
    }

    AVMutableComposition *mixComposition = [AVMutableComposition composition];
    NSDictionary *options = @{ AVURLAssetPreferPreciseDurationAndTimingKey : @YES };
    AVURLAsset *videoAsset = [[AVURLAsset alloc] initWithURL:url options:options];
    AVURLAsset *audioAsset = [[AVURLAsset alloc] initWithURL:audiopath options:options];

    NSLog(@"%@", [videoAsset tracksWithMediaType:AVMediaTypeVideo]);
    NSLog(@"%@", [audioAsset tracksWithMediaType:AVMediaTypeAudio]);
    NSLog(@"%f", CMTimeGetSeconds(videoAsset.duration));

    // Insert the whole video track, then stretch it to the audio's duration.
    CMTimeRange video_timeRange = CMTimeRangeMake(kCMTimeZero, videoAsset.duration);
    AVMutableCompositionTrack *a_compositionVideoTrack =
        [mixComposition addMutableTrackWithMediaType:AVMediaTypeVideo
                                    preferredTrackID:kCMPersistentTrackID_Invalid];
    NSError *videoError = nil;
    [a_compositionVideoTrack insertTimeRange:video_timeRange
                                     ofTrack:[[videoAsset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0]
                                      atTime:kCMTimeZero
                                       error:&videoError];
    [a_compositionVideoTrack scaleTimeRange:video_timeRange toDuration:audioAsset.duration];

    // Insert the audio track.
    CMTimeRange audio_timeRange = CMTimeRangeMake(kCMTimeZero, audioAsset.duration);
    AVMutableCompositionTrack *b_compositionAudioTrack =
        [mixComposition addMutableTrackWithMediaType:AVMediaTypeAudio
                                    preferredTrackID:kCMPersistentTrackID_Invalid];
    NSError *audioError = nil;
    [b_compositionAudioTrack insertTimeRange:audio_timeRange
                                     ofTrack:[[audioAsset tracksWithMediaType:AVMediaTypeAudio] objectAtIndex:0]
                                      atTime:kCMTimeZero
                                       error:&audioError];

    // Export the composition.
    AVAssetExportSession *_assetExport =
        [[AVAssetExportSession alloc] initWithAsset:mixComposition
                                         presetName:AVAssetExportPresetLowQuality];
    _assetExport.shouldOptimizeForNetworkUse = YES;
    _assetExport.outputFileType = AVFileTypeQuickTimeMovie;
    _assetExport.outputURL = [NSURL fileURLWithPath:savepath];
    _assetExport.timeRange = CMTimeRangeMake(kCMTimeZero, audioAsset.duration);

    [_assetExport exportAsynchronouslyWithCompletionHandler:^{
        switch (_assetExport.status) {
            case AVAssetExportSessionStatusCompleted:
                NSLog(@"Export Complete");
                // ------>>> From here I want to play the movie using MPMoviePlayerController. <<<---------
                [self play];
                // [self performSelector:@selector(play) withObject:nil afterDelay:2.0];
                break;
            case AVAssetExportSessionStatusFailed:
                NSLog(@"Export Failed");
                NSLog(@"ExportSessionError: %@", [_assetExport.error localizedDescription]);
                break;
            case AVAssetExportSessionStatusCancelled:
                NSLog(@"Export Cancelled");
                NSLog(@"ExportSessionError: %@", [_assetExport.error localizedDescription]);
                break;
            default:
                break;
        }
    }];
    NSLog(@"savepath: %@", savepath);
}
Please, somebody help me.
You are not passing a correct live-video URL to play.
In your code you have passed a local video URL from the documents directory: NSString *fileNamePath1 = @"Egg_break.mov";
Instead of this, you should pass a URL string from the server.
For example:
AVURLAsset *asset = [AVURLAsset URLAssetWithURL:[NSURL URLWithString:@"http://qtdevseed.apple.com/addemo/ad.m3u8"] options:nil];
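Note that an HLS live stream reports an indefinite duration and does not expose its tracks the way a file-based asset does, so the synchronous `.duration`/`tracksWithMediaType:` calls in the question either block or crash the merge. A minimal sketch (the guard logic is an assumption, not taken from the answer) of loading the streamed asset's keys asynchronously and checking them before attempting a composition:

```objc
#import <AVFoundation/AVFoundation.h>

AVURLAsset *asset = [AVURLAsset URLAssetWithURL:
    [NSURL URLWithString:@"http://qtdevseed.apple.com/addemo/ad.m3u8"] options:nil];

// Load the keys off the main thread instead of reading .duration/.tracks
// synchronously, which blocks until the (possibly endless) stream settles.
[asset loadValuesAsynchronouslyForKeys:@[@"tracks", @"duration"] completionHandler:^{
    NSError *error = nil;
    if ([asset statusOfValueForKey:@"tracks" error:&error] != AVKeyValueStatusLoaded) {
        NSLog(@"Tracks not loaded: %@", error);
        return;
    }
    // A live stream has an indefinite duration; building a CMTimeRange from it
    // (or indexing an empty tracks array) is what crashes the merge code above.
    if (CMTIME_IS_INDEFINITE(asset.duration) ||
        [asset tracksWithMediaType:AVMediaTypeVideo].count == 0) {
        NSLog(@"Live/HLS asset: cannot be merged with AVMutableComposition");
        return;
    }
    // Safe to proceed with the composition code from the question here.
}];
```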