
How to send the video captured from iPhone's camera to a server for live streaming?

I found some code online that captures video from the iPhone's camera and stores it to a video file, and it works fine. But my goal is not to save the video locally; it is to send it to a server. I found that there is a free media server called WOWZA that supports streaming, and that Apple offers HTTP Live Streaming (HLS); the server expects the video in H.264 format and the audio in MP3. From reading some of the Apple HLS documentation I also learned that the playlist file gives a different URL for each segment of the media file, and the segments are then played on the device, in the correct order, through the browser. What I don't know is how to get small segments of the file recorded by the phone's camera, or how to convert them into the required format. Here is the code that captures the video:
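(Before the code, for reference: from my reading of the docs, an HLS playlist (.m3u8) looks roughly like the sketch below; the segment names and durations are purely illustrative, not something WOWZA or Apple prescribes.)

#EXTM3U
#EXT-X-VERSION:3
#EXT-X-TARGETDURATION:10
#EXT-X-MEDIA-SEQUENCE:0
#EXTINF:9.97,
segment0.ts
#EXTINF:9.97,
segment1.ts
#EXTINF:9.97,
segment2.ts
#EXT-X-ENDLIST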

Implementation file

#import "THCaptureViewController.h"
#import <AVFoundation/AVFoundation.h>
#import "THPlayerViewController.h"

#define VIDEO_FILE @"test.mov"

@interface THCaptureViewController ()
@property (nonatomic, strong) AVCaptureSession *captureSession;
@property (nonatomic, strong) AVCaptureMovieFileOutput *captureOutput;
@property (nonatomic, weak) AVCaptureDeviceInput *activeVideoInput;
@property (nonatomic, strong) AVCaptureVideoPreviewLayer *previewLayer;
@end

@implementation THCaptureViewController

- (void)viewDidLoad 
{
[super viewDidLoad];

    #if TARGET_IPHONE_SIMULATOR
    self.simulatorView.hidden = NO;
        [self.view bringSubviewToFront:self.simulatorView];
    #else
    self.simulatorView.hidden = YES;
    [self.view sendSubviewToBack:self.simulatorView];
    #endif

// Hide the toggle button if device has less than 2 cameras. Does 3GS support iOS 6?
self.toggleCameraButton.hidden = [[AVCaptureDevice devicesWithMediaType:AVMediaTypeVideo] count] < 2;

dispatch_async(dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0), 
    ^{
    [self setUpCaptureSession];
});
}

#pragma mark - Configure Capture Session

- (void)setUpCaptureSession 
{
self.captureSession = [[AVCaptureSession alloc] init];


NSError *error;

// Set up hardware devices
AVCaptureDevice *videoDevice = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
if (videoDevice) {
    AVCaptureDeviceInput *input = [AVCaptureDeviceInput deviceInputWithDevice:videoDevice error:&error];
    if (input) {
        [self.captureSession addInput:input];
        self.activeVideoInput = input;
    }
}
AVCaptureDevice *audioDevice = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeAudio];
if (audioDevice) {
    AVCaptureDeviceInput *audioInput = [AVCaptureDeviceInput deviceInputWithDevice:audioDevice error:&error];
    if (audioInput) {
        [self.captureSession addInput:audioInput];
    }
}

//Create a VideoDataOutput and add it to the session
AVCaptureVideoDataOutput *output = [[AVCaptureVideoDataOutput alloc] init];
[self.captureSession addOutput:output];

// Setup the still image file output
AVCaptureStillImageOutput *stillImageOutput = [[AVCaptureStillImageOutput alloc] init];
[stillImageOutput setOutputSettings:@{AVVideoCodecKey : AVVideoCodecJPEG}];

if ([self.captureSession canAddOutput:stillImageOutput]) {
    [self.captureSession addOutput:stillImageOutput];
}

// Start running session so preview is available
[self.captureSession startRunning];

// Set up preview layer
dispatch_async(dispatch_get_main_queue(), ^{
    self.previewLayer = [AVCaptureVideoPreviewLayer layerWithSession:self.captureSession];
    self.previewLayer.frame = self.previewView.bounds;
    self.previewLayer.videoGravity = AVLayerVideoGravityResizeAspectFill;

    [[self.previewLayer connection] setVideoOrientation:[self currentVideoOrientation]];
    [self.previewView.layer addSublayer:self.previewLayer];
});

}

#pragma mark - Start Recording

- (IBAction)startRecording:(id)sender {

if ([sender isSelected]) {
    [sender setSelected:NO];
    [self.captureOutput stopRecording];

} else {
    [sender setSelected:YES];

    if (!self.captureOutput) {
        self.captureOutput = [[AVCaptureMovieFileOutput alloc] init];
        [self.captureSession addOutput:self.captureOutput];
    }

    // Delete the old movie file if it exists
    //[[NSFileManager defaultManager] removeItemAtURL:[self outputURL] error:nil];

    [self.captureSession startRunning];

    AVCaptureConnection *videoConnection = [self connectionWithMediaType:AVMediaTypeVideo fromConnections:self.captureOutput.connections];

    if ([videoConnection isVideoOrientationSupported]) {
        videoConnection.videoOrientation = [self currentVideoOrientation];
    }

    if ([videoConnection isVideoStabilizationSupported]) {
        videoConnection.enablesVideoStabilizationWhenAvailable = YES;
    }

    [self.captureOutput startRecordingToOutputFileURL:[self outputURL] recordingDelegate:self];
}

// Disable the toggle button if recording
self.toggleCameraButton.enabled = ![sender isSelected];
}

- (AVCaptureConnection *)connectionWithMediaType:(NSString *)mediaType fromConnections:(NSArray *)connections {
for (AVCaptureConnection *connection in connections) {
    for (AVCaptureInputPort *port in [connection inputPorts]) {
        if ([[port mediaType] isEqual:mediaType]) {
            return connection;
        }
    }
}
return nil;
}

#pragma mark - AVCaptureFileOutputRecordingDelegate

- (void)captureOutput:(AVCaptureFileOutput *)captureOutput didFinishRecordingToOutputFileAtURL:(NSURL *)outputFileURL fromConnections:(NSArray *)connections error:(NSError *)error {
if (!error) {
    [self presentRecording];
} else {
    NSLog(@"Error: %@", [error localizedDescription]);
}
}

#pragma mark - Show Last Recording

- (void)presentRecording
{
    NSString *tracksKey = @"tracks";
    AVAsset *asset = [AVURLAsset assetWithURL:[self outputURL]];
    [asset loadValuesAsynchronouslyForKeys:@[tracksKey] completionHandler:^{
        NSError *error;
        AVKeyValueStatus status = [asset statusOfValueForKey:tracksKey error:&error];
        if (status == AVKeyValueStatusLoaded) {
            dispatch_async(dispatch_get_main_queue(), ^{
                UIStoryboard *mainStoryboard = [UIStoryboard storyboardWithName:@"MainStoryboard" bundle:nil];
                THPlayerViewController *controller = [mainStoryboard instantiateViewControllerWithIdentifier:@"THPlayerViewController"];
                controller.title = @"Capture Recording";
                controller.asset = asset;
                [self presentViewController:controller animated:YES completion:nil];
            });
        }
    }];
}

#pragma mark - Recoding Destination URL

- (NSURL *)outputURL 
{
    NSString *documentsDirectory = [NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES) objectAtIndex:0];
    NSLog(@"documents Directory: %@", documentsDirectory);
    NSString *filePath = [documentsDirectory stringByAppendingPathComponent:VIDEO_FILE];

    NSLog(@"output url: %@", filePath);
    return [NSURL fileURLWithPath:filePath];
}

@end

I found this link, which shows how to capture the video frame by frame. But I am not sure whether capturing the video in frames will help me send H.264 video to the server. Can this be done, and if so, how?
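As far as I understand it, capturing frame by frame means attaching an AVCaptureVideoDataOutput with a sample-buffer delegate instead of (or alongside) the movie file output. A minimal sketch of that idea is below; the delegate method and settings keys are standard AVFoundation, while the helper class name and queue label are made up for illustration:

#import <AVFoundation/AVFoundation.h>

@interface THFrameCaptureHelper : NSObject <AVCaptureVideoDataOutputSampleBufferDelegate>
- (void)attachToSession:(AVCaptureSession *)session;
@end

@implementation THFrameCaptureHelper

- (void)attachToSession:(AVCaptureSession *)session
{
    AVCaptureVideoDataOutput *output = [[AVCaptureVideoDataOutput alloc] init];
    // Ask for BGRA pixel buffers and drop frames if the delegate falls behind.
    output.videoSettings = @{ (id)kCVPixelBufferPixelFormatTypeKey : @(kCVPixelFormatType_32BGRA) };
    output.alwaysDiscardsLateVideoFrames = YES;
    dispatch_queue_t queue = dispatch_queue_create("com.example.framecapture", DISPATCH_QUEUE_SERIAL);
    [output setSampleBufferDelegate:self queue:queue];
    if ([session canAddOutput:output]) {
        [session addOutput:output];
    }
}

// Called once per captured frame with an uncompressed CMSampleBuffer.
- (void)captureOutput:(AVCaptureOutput *)captureOutput
didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
       fromConnection:(AVCaptureConnection *)connection
{
    // The frames arriving here are raw pixel buffers; they still have to be
    // encoded to H.264 (for example by appending them to an AVAssetWriterInput)
    // before they are in the format the streaming server expects.
    CVImageBufferRef pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
    (void)pixelBuffer; // hand this to an encoder / writer here
}

@end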

Here, the person asking the question says (in a comment below the question) that he was able to do this successfully, but he does not mention how he captured the video.

Please tell me which data type I should use to get small segments of the captured video, and how to convert the captured data into the required format and send it to the server.
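For the "required format" half, my rough understanding is that the uncompressed sample buffers can be handed to an AVAssetWriter configured for H.264 video and AAC audio. A sketch of that setup is below; the file type, dimensions, sample rate, and bitrate are placeholder values I picked, not values the server dictates:

#import <AVFoundation/AVFoundation.h>

// Creates an asset writer that encodes appended sample buffers to H.264 + AAC.
// Cutting the result into HLS segments is a separate step (server-side, or with
// Apple's mediafilesegmenter tool).
static AVAssetWriter *MakeWriter(NSURL *outputURL, NSError **error)
{
    AVAssetWriter *writer = [AVAssetWriter assetWriterWithURL:outputURL
                                                     fileType:AVFileTypeMPEG4
                                                        error:error];

    NSDictionary *videoSettings = @{
        AVVideoCodecKey  : AVVideoCodecH264,   // H.264 video
        AVVideoWidthKey  : @1280,
        AVVideoHeightKey : @720
    };
    AVAssetWriterInput *videoInput =
        [AVAssetWriterInput assetWriterInputWithMediaType:AVMediaTypeVideo
                                           outputSettings:videoSettings];
    videoInput.expectsMediaDataInRealTime = YES;

    NSDictionary *audioSettings = @{
        AVFormatIDKey         : @(kAudioFormatMPEG4AAC),  // AAC audio
        AVSampleRateKey       : @44100,
        AVNumberOfChannelsKey : @1,
        AVEncoderBitRateKey   : @64000
    };
    AVAssetWriterInput *audioInput =
        [AVAssetWriterInput assetWriterInputWithMediaType:AVMediaTypeAudio
                                           outputSettings:audioSettings];
    audioInput.expectsMediaDataInRealTime = YES;

    if ([writer canAddInput:videoInput]) [writer addInput:videoInput];
    if ([writer canAddInput:audioInput]) [writer addInput:audioInput];
    return writer;
}

Appending the buffers from the data-output delegate to these inputs would produce a file with the codecs the server expects; what I still don't know is the best way to cut it into small segments for upload, which is exactly what I am asking about.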

You can use the Live SDK. You have to set up an nginx-powered streaming server; please follow this link. I have used it and it is a very effective solution. https://github.com/ltebean/Live
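For completeness, the "nginx-powered streaming server" side is typically built with the nginx-rtmp module: the phone pushes an RTMP stream to nginx, which repackages it as HLS. This is only an illustrative configuration sketch; the paths, ports, and the application name "live" are placeholders, not something the linked project mandates:

rtmp {
    server {
        listen 1935;                    # RTMP ingest from the phone
        application live {
            live on;
            hls on;                     # repackage the incoming stream as HLS
            hls_path /tmp/hls;          # .m3u8 playlist and .ts segments are written here
            hls_fragment 5s;
        }
    }
}

http {
    server {
        listen 8080;
        location /hls {                 # players fetch http://<host>:8080/hls/<stream>.m3u8
            types {
                application/vnd.apple.mpegurl m3u8;
                video/mp2t ts;
            }
            root /tmp;
        }
    }
}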
