
How to stream live video frames from client to flask server and back to the client?

I am trying to build a client-server architecture where I capture live video from the user's webcam using getUserMedia(). Now, instead of showing the video directly in the <video> tag, I want to send it to my flask server, do some processing on the frames, and throw it back to my web page.

I have used socketio to create a client-server connection. This is the script in my index.html. Please pardon my mistakes or any wrong code.

<div id="container">
    <video autoplay="true" id="videoElement">

    </video>
</div>
<script type="text/javascript" charset="utf-8">

    var socket = io('http://127.0.0.1:5000');

    // checking for connection
    socket.on('connect', function(){
      console.log("Connected... ", socket.connected)
    });

    var video = document.querySelector("#videoElement");


    // asking permission to access the system camera of user, capturing live 
    // video on getting true.

    if (navigator.mediaDevices.getUserMedia) {
      navigator.mediaDevices.getUserMedia({ video: true })
        .then(function (stream) {

          // instead of showing it directly in <video>, I want to send these frame to server

          //video.srcObject = stream

          //this code might be wrong, but this is what I want to do.
          socket.emit('catch-frame', { image: true, buffer: getFrame() });
        })
        .catch(function (err0r) {
          console.log(err0r)
          console.log("Something went wrong!");
        });
    }

    // returns a frame encoded in base64
    const getFrame = () => {
        const canvas = document.createElement('canvas');
        canvas.width = video.videoWidth;
        canvas.height = video.videoHeight;
        canvas.getContext('2d').drawImage(video, 0, 0);
        const data = canvas.toDataURL('image/png');
        return data;
    }


    // receive the frame from the server after processed and now I want display them in either 
    // <video> or <img>
    socket.on('response_back', function(frame){

      // this code here is wrong, but again this is what something I want to do.
      video.srcObject = frame;
    });

</script>

In my app.py -

from flask import Flask, render_template
from flask_socketio import SocketIO, emit

app = Flask(__name__)
socketio = SocketIO(app)

@app.route('/', methods=['POST', 'GET'])
def index():
    return render_template('index.html')

@socketio.on('catch-frame')
def catch_frame(data):

    ## getting the data frames

    ## do some processing 

    ## send it back to client
    emit('response_back', data)  ## ??


if __name__ == '__main__':
    socketio.run(app, host='127.0.0.1')

I have also thought of doing this with WebRTC, but I could only find code for peer-to-peer connections.

So, can anyone help me with this? Thanks in advance for the help.

So, what I was trying to do is take the real-time video stream captured by the client's webcam and process the frames at the backend.

My backend code is written in Python and I am using SocketIO to send the frames from the frontend to the backend. You can have a look at this design to get a better idea of what's happening - image

  1. My server (app.py) will be running in the backend and the client will be accessing index.html.
  2. A SocketIO connection will be established, and the video stream captured using the webcam will be sent to the server frame by frame.
  3. These frames will then be processed at the backend and emitted back to the client.
  4. Processed frames coming from the server can be shown in an img tag.

Here is the working code -

app.py

import base64
import io

import cv2
import imutils
import numpy as np
from PIL import Image
from flask_socketio import emit


@socketio.on('image')
def image(data_image):
    # decode the base64 string and load it as a PIL image
    b = io.BytesIO(base64.b64decode(data_image))
    pimg = Image.open(b)

    ## converting RGB to BGR, as per opencv standards
    frame = cv2.cvtColor(np.array(pimg), cv2.COLOR_RGB2BGR)

    # Process the image frame
    frame = imutils.resize(frame, width=700)
    frame = cv2.flip(frame, 1)
    imgencode = cv2.imencode('.jpg', frame)[1]

    # base64 encode the processed frame and prepend the data-URL header
    stringData = base64.b64encode(imgencode).decode('utf-8')
    b64_src = 'data:image/jpg;base64,'
    stringData = b64_src + stringData

    # emit the frame back to the client
    emit('response_back', stringData)

index.html

<div id="container">
    <canvas id="canvasOutput"></canvas>
    <video autoplay="true" id="videoElement"></video>
</div>

<div class = 'video'>
    <img id="image">
</div>

<script>
    var socket = io('http://localhost:5000');

    socket.on('connect', function(){
        console.log("Connected...!", socket.connected)
    });

    const video = document.querySelector("#videoElement");

    video.width = 500; 
    video.height = 375;

    if (navigator.mediaDevices.getUserMedia) {
        navigator.mediaDevices.getUserMedia({ video: true })
        .then(function (stream) {
            video.srcObject = stream;
            video.play();
        })
        .catch(function (err0r) {
            console.log(err0r)
            console.log("Something went wrong!");
        });
    }

    // these require OpenCV.js (the cv namespace) to be loaded on the page
    let src = new cv.Mat(video.height, video.width, cv.CV_8UC4);
    let dst = new cv.Mat(video.height, video.width, cv.CV_8UC1);
    let cap = new cv.VideoCapture(video);

    const FPS = 22;

    setInterval(() => {
        cap.read(src);

        var type = "image/png"
        var data = document.getElementById("canvasOutput").toDataURL(type);
        data = data.replace('data:' + type + ';base64,', ''); //split off junk 
        at the beginning

        socket.emit('image', data);
    }, 10000/FPS);


    socket.on('response_back', function(image){
        const image_id = document.getElementById('image');
        image_id.src = image;
    });

</script>

Also, note that this needs to run on a secure origin: browsers only allow getUserMedia over HTTPS (or on localhost).
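
As a side note, here is a minimal sketch of serving the Flask-SocketIO app over HTTPS; it assumes eventlet is installed (the certfile/keyfile arguments are forwarded to it), and 'cert.pem' / 'key.pem' are placeholder paths for your own certificate files, not something from the answer above:

from flask import Flask, render_template
from flask_socketio import SocketIO

app = Flask(__name__)
socketio = SocketIO(app)

@app.route('/')
def index():
    return render_template('index.html')

if __name__ == '__main__':
    # 'cert.pem' and 'key.pem' are placeholder paths for your own certificate;
    # with eventlet installed these kwargs are passed through to the underlying server.
    socketio.run(app, host='0.0.0.0', port=5000,
                 certfile='cert.pem', keyfile='key.pem')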

I had to tweak your solution a bit:

I commented out the three cv variables and the cap.read(src) statement, and modified the following line

var data = document.getElementById("canvasOutput").toDataURL(type);

to

        var video_element = document.getElementById("videoElement")
        var frame = capture(video_element, 1)
        var data = frame.toDataURL(type);

using the capture function from here: http://appcropolis.com/blog/web-technology/using-html5-canvas-to-capture-frames-from-a-video/

I'm not sure if this is the right way to do it, but it happened to work for me.

Like I said, I'm not super comfortable with JavaScript, so instead of manipulating the base64 string in JavaScript, I'd much rather just send the whole data from JavaScript and parse it in Python this way:

# Important to only split once
headers, image = base64_image.split(',', 1) 
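
For context, here is a minimal sketch of how that split could sit inside the Socket.IO handler; the switch to cv2.imdecode (instead of PIL) and the pass-through processing are my own assumptions, not part of the answers above:

import base64

import cv2
import numpy as np
from flask_socketio import emit


@socketio.on('image')
def image(base64_image):
    # the client sends the full data URL, e.g. 'data:image/png;base64,AAAA...'
    headers, encoded = base64_image.split(',', 1)

    # decode the base64 payload straight into an OpenCV BGR frame
    frame = cv2.imdecode(np.frombuffer(base64.b64decode(encoded), dtype=np.uint8),
                         cv2.IMREAD_COLOR)

    # ... process the frame here, then re-encode it and send it back as before
    retval, buffer = cv2.imencode('.jpg', frame)
    emit('response_back',
         'data:image/jpg;base64,' + base64.b64encode(buffer).decode('utf-8'))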

My takeaway from this, at the risk of sounding circular, is that you can't directly pull an image string out of a canvas that contains a video element; you need to create a new canvas onto which you draw a 2D image of the frame you capture from the video element.
