How to save the stream frames at the client in WebRTC?
I'm using a video-conference implementation built with WebRTC and Node.js.
I'm sending a video from a server to a client, and I need to compute the PSNR of the received video to measure objective visual quality.
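For context, PSNR compares a received frame against the reference frame pixel by pixel. A minimal sketch in JavaScript, assuming both frames are available as equally sized 8-bit pixel buffers (e.g. `ImageData.data` extracted via a canvas; the function name is my own):

```javascript
// PSNR between two same-length 8-bit pixel buffers (e.g. ImageData.data).
// Assumes the frames are already aligned; mapping frames is the hard part.
function psnr(ref, dist) {
  if (ref.length !== dist.length) {
    throw new Error('frames must have the same size');
  }
  let sse = 0; // sum of squared errors
  for (let i = 0; i < ref.length; i++) {
    const d = ref[i] - dist[i];
    sse += d * d;
  }
  const mse = sse / ref.length;
  if (mse === 0) return Infinity; // identical frames
  return 10 * Math.log10((255 * 255) / mse); // in dB, for 8-bit samples
}
```

For example, a uniform per-pixel difference of 1 gives an MSE of 1 and a PSNR of about 48.13 dB.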
My concerns are:
Record Audio and Video with MediaRecorder
I solved the problem using MediaRecorder. With the MediaRecorder API, you can start and stop the recorder and collect the stream data as the chunks arrive.
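As a sketch of that flow (the chunk interval, MIME type, and function name here are illustrative choices, not taken from the original code):

```javascript
// Sketch: record a MediaStream for a fixed duration and resolve with a Blob.
// `stream` could be the remote stream delivered by RTCPeerConnection's
// `ontrack` event, or a local stream from getUserMedia().
function recordStream(stream, durationMs, mimeType = 'video/webm; codecs=vp8') {
  return new Promise((resolve, reject) => {
    const chunks = [];
    const recorder = new MediaRecorder(stream, { mimeType });
    recorder.ondataavailable = (e) => {
      if (e.data && e.data.size > 0) chunks.push(e.data); // collect as chunks arrive
    };
    recorder.onstop = () => resolve(new Blob(chunks, { type: mimeType }));
    recorder.onerror = (e) => reject(e.error);
    recorder.start(250); // request a chunk roughly every 250 ms
    setTimeout(() => recorder.stop(), durationMs);
  });
}
```

The resulting Blob can then be saved via an object URL or uploaded to the server for offline PSNR analysis.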
The MediaStream can come from:
It supports the following MIME types:
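Supported types vary by browser, so it is worth probing before recording. This sketch uses the standard `MediaRecorder.isTypeSupported()` static method; the candidate list and helper name are illustrative:

```javascript
// Filter a candidate list down to what this browser can actually record.
function supportedMimeTypes(candidates) {
  return candidates.filter((t) => MediaRecorder.isTypeSupported(t));
}

// Illustrative candidates; Chrome and Firefox typically favour WebM.
const candidates = [
  'video/webm; codecs=vp9',
  'video/webm; codecs=vp8',
  'video/webm',
  'video/mp4',
];
```

Passing the first supported entry as the `mimeType` option when constructing the MediaRecorder avoids a `NotSupportedError`.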
The following demo demonstrates this, and its code is available as well.
I still have to solve the second concern?! Any idea for mapping the local and remote video frames?