Create video stream from HTML5 <canvas>
I have an audio visualizer written in JS which draws on a <canvas> element.
Is it possible (without screen capture) to turn that <canvas> into a (realtime) video stream? Perhaps somehow write it to a socket directly.
The JS uses THREE.js for rendering.
Preferably I'd like to be able to run this on a webserver. It's probably not possible to do this without actually using a browser, but if it is, I'd be very happy to hear about it ;)
Using the info from Blindman67 I've managed to figure out a way of achieving the desired result.
I will end up using PhantomJS and have it write images to /dev/stdout (or another socket) and use ffmpeg to turn that into a video stream (sort of as described in this question).
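A minimal sketch of that pipeline, assuming the visualizer page is served at a hypothetical URL (http://localhost:8080/) and a Linux host where /dev/stdout is writable; the exact frame rate and sizes are placeholders:

```javascript
// capture.js — run with: phantomjs capture.js
// PhantomJS loads the visualizer page and renders PNG frames to stdout,
// which ffmpeg reads as an image pipe, e.g.:
//   phantomjs capture.js | ffmpeg -f image2pipe -c:v png -r 25 -i - \
//     -c:v libx264 -pix_fmt yuv420p out.mp4
var page = require('webpage').create();

page.viewportSize = { width: 640, height: 360 };

page.open('http://localhost:8080/', function (status) {
  if (status !== 'success') {
    phantom.exit(1);
  }
  // Render a frame roughly every 40 ms (~25 fps).
  // '/dev/stdout' as a render target is a Linux-specific trick.
  setInterval(function () {
    page.render('/dev/stdout', { format: 'png' });
  }, 40);
});
```

PhantomJS has no built-in frame pacing, so the setInterval rate and the ffmpeg `-r` value need to agree or the output video will drift.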
I will also run a test using Whammy, but as described in its GitHub readme that might not produce the desired result; only one way to find out.
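For reference, a sketch of how Whammy is typically driven (assumes whammy.js is loaded in the page; the 25 fps value is an assumption). Whammy stitches WebP-encoded canvas frames into a WebM blob, which only works in browsers whose toDataURL supports image/webp (effectively Chrome):

```javascript
// Sketch: client-side WebM encoding with Whammy.
var encoder = new Whammy.Video(25);          // target fps (assumed value)
var canvas = document.querySelector('canvas');

// Call once per rendered frame, e.g. from the THREE.js render loop.
function captureFrame() {
  encoder.add(canvas);                       // grabs the canvas as a WebP frame
}

// When done capturing:
function finish() {
  var blob = encoder.compile();              // WebM Blob
  // ...upload the blob to a server, or hand it to a download link.
}
```

Note this produces a finished file rather than a live stream, which is why it may not fit the realtime requirement.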
Edit: I will also try the suggestion from kaiido to use WebRTC.
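The WebRTC route builds on canvas.captureStream() (Media Capture from DOM Elements spec), which yields a realtime MediaStream directly from the canvas. A sketch, with signaling omitted since that part is application-specific:

```javascript
// Sketch: realtime MediaStream from a canvas.
var canvas = document.querySelector('canvas');
var stream = canvas.captureStream(25); // capture at 25 fps (assumed value)

// Option A: record the stream to WebM chunks and ship them to a server.
var recorder = new MediaRecorder(stream, { mimeType: 'video/webm' });
recorder.ondataavailable = function (e) {
  // e.data is a Blob chunk; send it over a WebSocket, for example.
};
recorder.start(1000); // emit a chunk every second

// Option B: send the stream to a peer over WebRTC.
var pc = new RTCPeerConnection();
stream.getTracks().forEach(function (track) {
  pc.addTrack(track, stream);
});
// ...then perform the usual offer/answer signaling (not shown).
```

Unlike the PhantomJS approach, this runs entirely in the browser, so it only helps if a browser is acceptable in the setup.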