How to send the video captured from iPhone's camera to a server for live streaming?
How to display the video captured from iPhone's camera on the screen without saving it?
I am new to iOS and multimedia development, and I am trying to capture video from the iPhone's camera and display it on the screen without saving it to memory.
So far, with the help of sample code found online, I have written the following:
Header file:
// ViewController.h
#import <UIKit/UIKit.h>
#import <AVFoundation/AVFoundation.h>
@class AVCamCaptureManager, AVCamPreviewView, AVCaptureVideoPreviewLayer;
@interface ViewController : UIViewController <UIImagePickerControllerDelegate>
@property (weak, nonatomic) IBOutlet UIView *previewView;
- (IBAction)StartCapture:(id)sender;
- (void)setCaptureSession;
@end
Implementation file:
// ViewController.m
#import "ViewController.h"
@interface ViewController ()
@property (nonatomic, strong) AVCaptureSession *captureSession;
@property (nonatomic, strong) AVCaptureMovieFileOutput *captureOutput;
@property (nonatomic, weak) AVCaptureDeviceInput *activeVideoInput;
@property (nonatomic, strong) AVCaptureVideoPreviewLayer *previewLayer;
@end
@implementation ViewController
- (void)viewDidLoad
{
    [super viewDidLoad];
    // Do any additional setup after loading the view, typically from a nib.
}

- (void)didReceiveMemoryWarning
{
    [super didReceiveMemoryWarning];
    // Dispose of any resources that can be recreated.
}
- (IBAction)StartCapture:(id)sender
{
    if ([sender isSelected])
    {
        [sender setSelected:NO];
        [self.captureOutput stopRecording];
    }
    else
    {
        [sender setSelected:YES];
        if (!self.captureOutput)
        {
            self.captureOutput = [[AVCaptureMovieFileOutput alloc] init];
            [self.captureSession addOutput:self.captureOutput];
        }
        [self.captureSession startRunning];
    }
}
- (void)setCaptureSession
{
    self.captureSession = [[AVCaptureSession alloc] init];
    NSError *error;

    // Set up hardware devices
    AVCaptureDevice *videoDevice = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
    if (videoDevice)
    {
        AVCaptureDeviceInput *input = [AVCaptureDeviceInput deviceInputWithDevice:videoDevice error:&error];
        if (input)
        {
            [self.captureSession addInput:input];
            self.activeVideoInput = input;
        }
    }
    AVCaptureDevice *audioDevice = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeAudio];
    if (audioDevice) {
        AVCaptureDeviceInput *audioInput = [AVCaptureDeviceInput deviceInputWithDevice:audioDevice error:&error];
        if (audioInput) {
            [self.captureSession addInput:audioInput];
        }
    }

    // Start running session so preview is available
    [self.captureSession startRunning];

    // Set up preview layer
    dispatch_async(dispatch_get_main_queue(), ^{
        self.previewLayer = [AVCaptureVideoPreviewLayer layerWithSession:self.captureSession];
        self.previewLayer.frame = self.previewView.bounds;
        self.previewLayer.videoGravity = AVLayerVideoGravityResizeAspectFill;
        //[[self.previewLayer connection] setVideoOrientation:[self currentVideoOrientation]];
        [self.previewView.layer addSublayer:self.previewLayer];
    });
}
// Re-enable capture session if not running currently
- (void)viewWillAppear:(BOOL)animated
{
    [super viewWillAppear:animated];
    if (![self.captureSession isRunning])
    {
        dispatch_async(dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0), ^{
            [self.captureSession startRunning];
        });
    }
}

// Stop running capture session when this view disappears
- (void)viewWillDisappear:(BOOL)animated
{
    [super viewWillDisappear:animated];
    if ([self.captureSession isRunning])
    {
        dispatch_async(dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0), ^{
            [self.captureSession stopRunning];
        });
    }
}
@end
Please tell me what I should write in the IBAction method (the one called when the capture-video button is pressed), and how to use AVCaptureVideoPreviewLayer to display the captured video on the screen.
Thanks.
I found what I was missing. It was a really silly mistake on my part: I never called the setCaptureSession method at all, so nothing showed up on the screen.
I simply called setCaptureSession in viewDidLoad, and that solved the problem.
- (void)viewDidLoad
{
    [super viewDidLoad];
    // Do any additional setup after loading the view, typically from a nib.
    [self setCaptureSession];
}
There is still a problem with the video after rotating the phone, which I will have to deal with, but for now everything captured by the camera is visible on the screen.
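The rotation problem is typically a matter of resizing the preview layer and updating its connection's video orientation when the interface rotates. A minimal sketch of one way to handle it, reusing the `previewLayer` and `previewView` properties from the code above (this is an assumption about how you want rotation to behave, not the only approach):

```objc
// Keep the preview layer sized to its container and oriented to the UI.
// Called automatically whenever the view lays out, including on rotation.
- (void)viewWillLayoutSubviews
{
    [super viewWillLayoutSubviews];
    self.previewLayer.frame = self.previewView.bounds;

    AVCaptureConnection *connection = self.previewLayer.connection;
    if (connection.isVideoOrientationSupported) {
        UIInterfaceOrientation orientation = [[UIApplication sharedApplication] statusBarOrientation];
        switch (orientation) {
            case UIInterfaceOrientationLandscapeLeft:
                connection.videoOrientation = AVCaptureVideoOrientationLandscapeLeft;
                break;
            case UIInterfaceOrientationLandscapeRight:
                connection.videoOrientation = AVCaptureVideoOrientationLandscapeRight;
                break;
            case UIInterfaceOrientationPortraitUpsideDown:
                connection.videoOrientation = AVCaptureVideoOrientationPortraitUpsideDown;
                break;
            default:
                connection.videoOrientation = AVCaptureVideoOrientationPortrait;
                break;
        }
    }
}
```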
You need an AVCaptureVideoPreviewLayer.
As for what to write in the IBAction method, that depends on how you want your app to behave. If you want it to set up the capture session and run the preview as soon as it loads, you don't really need a button at all. If you want to start the capture session from a button, just put the preview-layer code into the same method, after the capture session has been initialized.
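For example, the button-driven variant might look like the sketch below. It assumes `setCaptureSession` has been changed to only configure inputs and outputs (i.e. the preview-layer block and the `startRunning` call are removed from it), and it reuses the property names from the question's code:

```objc
// Hypothetical button action: build the session on first press,
// then attach the preview layer and start running.
- (IBAction)startCapture:(id)sender
{
    if (!self.captureSession) {
        [self setCaptureSession]; // assumed to configure inputs/outputs only

        // Preview-layer code, placed right after the session is initialized:
        self.previewLayer = [AVCaptureVideoPreviewLayer layerWithSession:self.captureSession];
        self.previewLayer.frame = self.previewView.bounds;
        self.previewLayer.videoGravity = AVLayerVideoGravityResizeAspectFill;
        [self.previewView.layer addSublayer:self.previewLayer];
    }
    if (![self.captureSession isRunning]) {
        [self.captureSession startRunning];
    }
}
```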