How to record and save video using HTML5 WebRTC
Run the code snippet first, then read the description; it will give you the structure.

I want to record, play and save video in the 2nd video element. The problem I am facing is: the stream is running in the 1st video element, but I am unable to record and save the video.
.video { border: 1px solid gray; box-shadow: 3px 4px lightgray; }
<link href="https://maxcdn.bootstrapcdn.com/bootstrap/4.0.0/css/bootstrap.min.css" rel="stylesheet"/>
<div style="text-align:center">
  <h1>Welcome to WebRTC</h1>
  <video class="video" #video autoplay controls></video>
  <video class="video" style="width:360px;" autoplay controls #recordedVideo></video>
  <br>
  <button class="btn btn-warning" (click)="startRecording()">Start</button>
  <button class="btn btn-warning" (click)="stopRecording()">Stop</button>
  <button class="btn btn-warning" (click)="playRecording()">Play</button>
</div>
What I did here, building on Luis Estevez's code: I declared the event handlers inside the startRecording method, because when I tried to push a stream chunk into the blob array it threw an error saying the push method does not exist, even though I had created the array before assigning the handlers. (The cause is that assigning this.handleStop directly loses the component's this binding inside the handler; defining the handlers as arrow functions avoids that.)
startRecording(stream) {
  let options = { mimeType: 'video/webm' }
  this.recordedBlobs = []
  console.log(this.recordedBlobs)
  try {
    this.mediaRecorder = new MediaRecorder(stream, options)
  } catch (e0) {
    console.log('Try different mimeType')
  }
  console.log('Created MediaRecorder', this.mediaRecorder, 'with options', options)
  // this.mediaRecorder.onstop = this.handleStop
  this.mediaRecorder.onstop = (event) => {
    console.log('Recorder stopped: ', event)
    const videoBuffer = new Blob(this.recordedBlobs, { type: 'video/webm' })
    this.downloadUrl = window.URL.createObjectURL(videoBuffer) // you can download with <a> tag
    this.recordVideoElement = this.recordVideoElementRef.nativeElement
    this.recordVideoElement.src = this.downloadUrl
  }
  // this.mediaRecorder.ondataavailable = this.handleDataAvailable
  this.mediaRecorder.ondataavailable = (event) => {
    if (event.data && event.data.size > 0) {
      this.recordedBlobs.push(event.data)
    }
  }
  this.mediaRecorder.start(100) // collect data in 100ms chunks
  console.log('MediaRecorder started', this.mediaRecorder)
}
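The catch branch above only logs 'Try different mimeType' without actually trying one. Here is a minimal sketch of how a fallback could work; the helper name pickSupportedMimeType is my own invention, not part of the original code, and the support check is injected so the logic can run outside a browser (in the component you would pass MediaRecorder.isTypeSupported):

```typescript
// Hypothetical helper: returns the first candidate mimeType accepted by the
// injected support check. In the browser, pass MediaRecorder.isTypeSupported.
function pickSupportedMimeType(
  candidates: string[],
  isSupported: (t: string) => boolean
): string | undefined {
  return candidates.find(isSupported)
}

// Browser usage (sketch):
// const mimeType = pickSupportedMimeType(
//   ['video/webm;codecs=vp9', 'video/webm;codecs=vp8', 'video/webm'],
//   t => MediaRecorder.isTypeSupported(t)
// )
// this.mediaRecorder = new MediaRecorder(stream, mimeType ? { mimeType } : undefined)
```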
Thanks Luis Estevez :)
You didn't 'really' record the stream; you just copied the stream object, not the event data coming from the stream.
Use MediaRecorder and pass the stream as a constructor argument. Grab the video blob from the ondataavailable event handler. Join the recorded array of blobs into a new Blob. From there you can get a URL using createObjectURL(blob).
The following snippet is pseudo code:
** TypeScript doesn't recognize 'MediaRecorder', so you'll have to find a way to give MediaRecorder the type any **
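If you don't want to install a types package, one minimal workaround (an assumption about your setup, not the only route) is an ambient declaration in the component file or a *.d.ts file:

```typescript
// Ambient declaration: tells the TypeScript compiler that a global
// MediaRecorder exists with type `any`. The browser supplies the real
// implementation at runtime, so nothing is emitted for this line.
declare var MediaRecorder: any;
```

Installing @types/dom-mediacapture-record, as mentioned at the end of this answer, gives you proper types instead of any.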
mediaRecorder: any;
recordedBlobs: Blob[];
downloadUrl: string;

handleDataAvailable(event) {
  if (event.data && event.data.size > 0) {
    this.recordedBlobs.push(event.data);
  }
}

handleStop(event) {
  console.log('Recorder stopped: ', event);
  const videoBuffer = new Blob(this.recordedBlobs, {type: 'video/webm'});
  this.downloadUrl = window.URL.createObjectURL(videoBuffer); // you can download with <a> tag
  this.recordVideoElement.src = this.downloadUrl;
}

startRecording(stream) {
  let options = {mimeType: 'video/webm'};
  this.recordedBlobs = [];
  try {
    this.mediaRecorder = new MediaRecorder(stream, options);
  } catch (e0) {
    console.log('Try different mimeType');
  }
  console.log('Created MediaRecorder', this.mediaRecorder, 'with options', options);
  this.mediaRecorder.onstop = this.handleStop.bind(this); // bind so 'this' is the component inside the handler
  this.mediaRecorder.ondataavailable = this.handleDataAvailable.bind(this);
  this.mediaRecorder.start(100); // collect data in 100ms chunks
  console.log('MediaRecorder started', this.mediaRecorder);
}

stopRecording() {
  this.mediaRecorder.stop();
  console.log('Recorded Blobs: ', this.recordedBlobs);
  this.recordVideoElement.controls = true;
}

playRecording() {
  if (!this.recordedBlobs.length) {
    console.log('cannot play.');
    return;
  }
  this.recordVideoElement.play();
}

async ngOnInit() {
  navigator.mediaDevices.getUserMedia({ video: { width: 360 } }).then(stream => {
    this.videoElement.srcObject = stream;
    this.startRecording(stream);
  });
}
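The handleStop comment above says "you can download with <a> tag" but never shows it. A small sketch of that step; buildDownloadLink and the injected doc parameter are my own names, introduced so the logic can be exercised outside a browser:

```typescript
// Hypothetical helper: builds an <a> element pointing at the blob URL, with a
// suggested filename. `doc` is injected; in the component, pass `document`.
function buildDownloadLink(
  doc: { createElement: (tag: string) => any },
  url: string,
  filename: string
) {
  const a = doc.createElement('a')
  a.href = url
  a.download = filename // the browser uses this as the saved file's name
  return a
}

// Browser usage, after handleStop has set this.downloadUrl:
// buildDownloadLink(document, this.downloadUrl, 'recording.webm').click()
```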
// Imports, decorator, and class wrapper restored for completeness; the
// selector and template names below are illustrative.
import { Component, ElementRef, OnInit, ViewChild } from '@angular/core'

@Component({
  selector: 'app-record',
  templateUrl: './record.component.html'
})
export class RecordComponent implements OnInit {
  @ViewChild('recordedVideo') recordVideoElementRef: ElementRef
  @ViewChild('video') videoElementRef: ElementRef

  videoElement: HTMLVideoElement
  recordVideoElement: HTMLVideoElement
  mediaRecorder: MediaRecorder
  recordedBlobs: Blob[]
  isRecording: boolean = false
  downloadUrl: string
  stream: MediaStream

  constructor() {
  }

  async ngOnInit() {
    this.videoElement = this.videoElementRef.nativeElement
    this.recordVideoElement = this.recordVideoElementRef.nativeElement
    navigator.mediaDevices.getUserMedia({
      video: {
        width: 360
      }
    }).then(stream => {
      this.stream = stream
      this.videoElement.srcObject = this.stream
    })
  }

  startRecording() {
    this.recordedBlobs = []
    let options: MediaRecorderOptions = { mimeType: 'video/webm' }
    try {
      this.mediaRecorder = new MediaRecorder(this.stream, options)
    } catch (err) {
      console.log(err)
    }
    this.mediaRecorder.start() // without a timeslice, data arrives when recording stops
    this.isRecording = !this.isRecording
    this.onDataAvailableEvent()
    this.onStopRecordingEvent()
  }

  stopRecording() {
    this.mediaRecorder.stop()
    this.isRecording = !this.isRecording
    console.log('Recorded Blobs: ', this.recordedBlobs)
  }

  playRecording() {
    if (!this.recordedBlobs || !this.recordedBlobs.length) {
      console.log('cannot play.')
      return
    }
    this.recordVideoElement.play()
  }

  onDataAvailableEvent() {
    try {
      this.mediaRecorder.ondataavailable = (event: BlobEvent) => {
        if (event.data && event.data.size > 0) {
          this.recordedBlobs.push(event.data)
        }
      }
    } catch (error) {
      console.log(error)
    }
  }

  onStopRecordingEvent() {
    try {
      this.mediaRecorder.onstop = (event: Event) => {
        const videoBuffer = new Blob(this.recordedBlobs, { type: 'video/webm' })
        this.downloadUrl = window.URL.createObjectURL(videoBuffer) // you can download with <a> tag
        this.recordVideoElement.src = this.downloadUrl
      }
    } catch (error) {
      console.log(error)
    }
  }
}
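One thing the component above never does is release the camera or revoke the blob URL when the component is torn down. A sketch of a cleanup helper you could call from ngOnDestroy; releaseRecordingResources and the structural types are my own, and the revoke callback is injected so the function runs outside a browser (pass window.URL.revokeObjectURL in the component):

```typescript
// Hypothetical cleanup sketch: stops every track of the capture stream and
// revokes the blob URL created for the recorded video.
interface TrackLike { stop(): void }
interface StreamLike { getTracks(): TrackLike[] }

function releaseRecordingResources(
  stream: StreamLike | undefined,
  downloadUrl: string | undefined,
  revoke: (url: string) => void // pass window.URL.revokeObjectURL in the browser
) {
  if (stream) {
    stream.getTracks().forEach(track => track.stop()) // turns the camera light off
  }
  if (downloadUrl) {
    revoke(downloadUrl) // frees the memory backing the blob URL
  }
}

// In the component:
// ngOnDestroy() {
//   releaseRecordingResources(this.stream, this.downloadUrl, u => window.URL.revokeObjectURL(u))
// }
```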
<div style="text-align:center">
  <h1>Welcome to WebRTC</h1>
  <video class="video" #video autoplay controls></video>
  <span class="m-1"></span>
  <video class="video" style="width:360px !important;" controls #recordedVideo></video>
  <br>
  <button class="btn btn-primary btn-lg" *ngIf="!isRecording" (click)="startRecording()">Start Recording</button>
  <button class="btn btn-warning btn-lg" *ngIf="isRecording" (click)="stopRecording()">Stop Recording</button>
</div>
Note: if you get an error that MediaRecorder is not found, then run:

npm i @types/dom-mediacapture-record

Be sure to update your Chrome browser too.
Have a good day!