
How to save the stream frames at the client in WebRTC?

I'm using a video conference implementation built with WebRTC and Node.js.

I'm sending a video from a server to a client. I need to compute the PSNR of the received video to measure its objective visual quality.

My concerns are:

  1. How do I save the streamed frames at the client, from the HTML5 video element?
  2. If (1) is achieved, how do I map the original frames to the received ones?
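For reference, once pairs of original and received frames are in hand, PSNR itself is straightforward to compute. The sketch below is not from the original post; it assumes each frame has been decoded to a flat array of 8-bit samples (for example `ImageData.data` from a canvas `drawImage`/`getImageData` round trip).

```javascript
// Peak signal-to-noise ratio between two decoded frames of equal size.
// `reference` and `received` are flat arrays of 8-bit samples
// (e.g. the ImageData.data of a frame drawn to a canvas).
function psnr(reference, received) {
  if (reference.length !== received.length) {
    throw new Error("frames must have the same dimensions");
  }
  let sumSq = 0;
  for (let i = 0; i < reference.length; i++) {
    const d = reference[i] - received[i];
    sumSq += d * d; // accumulate squared error
  }
  const mse = sumSq / reference.length; // mean squared error
  if (mse === 0) return Infinity;       // identical frames
  const MAX = 255;                      // peak value for 8-bit samples
  return 10 * Math.log10((MAX * MAX) / mse);
}
```

The hard part, as the question notes, is pairing the frames correctly in the first place; PSNR is only meaningful when each received frame is compared against the matching original frame.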

Record Audio and Video with MediaRecorder使用 MediaRecorder 录制音频和视频

I solved the problem using MediaRecorder. With the MediaRecorder API, you can start and stop the recorder and collect the stream data as it arrives.

The MediaStream can come from:

  • A getUserMedia() call.
  • The receiving end of a WebRTC call.
  • A screen recording.
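A minimal recording sketch for any of the sources above might look like this (browser code; the `pc.ontrack` handler, helper name, and 10-second duration are illustrative assumptions, not from the original answer):

```javascript
// Collect a MediaStream into a Blob that can be downloaded or uploaded
// for offline analysis (e.g. PSNR computation against the original video).
function recordStream(stream, mimeType, durationMs) {
  return new Promise((resolve, reject) => {
    const recorder = new MediaRecorder(stream, { mimeType });
    const chunks = [];
    recorder.ondataavailable = (e) => {
      if (e.data && e.data.size > 0) chunks.push(e.data); // buffer each chunk
    };
    recorder.onstop = () => resolve(new Blob(chunks, { type: mimeType }));
    recorder.onerror = (e) => reject(e.error);
    recorder.start(1000);                       // emit a chunk every second
    setTimeout(() => recorder.stop(), durationMs);
  });
}

// Hypothetical usage on the receiving end of a WebRTC call:
// pc.ontrack = async ({ streams: [stream] }) => {
//   const blob = await recordStream(stream, "video/webm;codecs=vp8", 10000);
//   // save `blob` for offline quality analysis
// };
```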

It supports the following MIME types:

  • audio/webm
  • video/webm
  • video/webm;codecs=vp8
  • video/webm;codecs=vp9
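Support varies by browser, so it is worth probing with `MediaRecorder.isTypeSupported` before starting a recording. The helper below is a sketch (the injectable `isSupported` parameter is an assumption added so the selection logic can run outside a browser):

```javascript
// Return the first MIME type the browser's MediaRecorder can encode,
// or null if none of the candidates is supported.
function pickMimeType(candidates, isSupported) {
  const supported = isSupported ||
    ((t) => typeof MediaRecorder !== "undefined" &&
            MediaRecorder.isTypeSupported(t));
  return candidates.find(supported) || null;
}

// In a browser you might call:
// const mimeType = pickMimeType([
//   "video/webm;codecs=vp9",
//   "video/webm;codecs=vp8",
//   "video/webm",
// ]);
```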

The following demo demonstrates this, and the code is available as well.

I still have to solve the second concern. Any ideas for mapping the local and remote video frames?
