
Regarding video and audio stream over socket.io from Android camera live

I want to stream real-time video from the Android camera to another device via socket.io, but I am not able to send audio.

I am using a SurfaceView to create the camera view.

In the camera preview callback I am getting the video bytes.

But those bytes do not contain any audio; they only contain the video frame.

How can I merge video and audio into a byte array at the same time while recording, send it to the other device, and decode it on the other side?

The code I am using is like this:

 mCamera.setPreviewCallbackWithBuffer(new Camera.PreviewCallback() {

     private long timestamp = 0;

     public synchronized void onPreviewFrame(byte[] data, Camera camera) {
         // The preview buffer holds a raw NV21 frame (video only, no audio).
         Camera.Size previewSize = camera.getParameters().getPreviewSize();
         YuvImage yuvimage = new YuvImage(data, ImageFormat.NV21,
                 previewSize.width, previewSize.height, null);

         // Compress the frame to JPEG so it fits in a datagram.
         ByteArrayOutputStream baos = new ByteArrayOutputStream();
         yuvimage.compressToJpeg(new Rect(0, 0, previewSize.width, previewSize.height), 80, baos);
         byte[] jdata = baos.toByteArray();

         DatagramSocket s = null;
         try {
             s = new DatagramSocket();
             s.setBroadcast(true);
             s.setSoTimeout(TIMEOUT_MS);
             InetAddress local = InetAddress.getByName(IPAdresse.getText().toString());

             DatagramPacket p = new DatagramPacket(jdata, jdata.length, local, server_port);
             s.send(p);
         } catch (IOException e) {
             // Covers SocketException and UnknownHostException as well.
             e.printStackTrace();
         } finally {
             if (s != null) {
                 s.close(); // avoid leaking one socket per frame
             }
         }

         // Show the captured frame locally for debugging.
         Bitmap bmp = BitmapFactory.decodeByteArray(jdata, 0, jdata.length);
         m_VideCaptureBitmap.setImageBitmap(bmp);

         Log.v("CameraTest", "Frame size = " + data.length);
         timestamp = System.currentTimeMillis();

         // Return the buffer to the camera so it can be reused for the next frame.
         try {
             camera.addCallbackBuffer(data);
         } catch (Exception e) {
             Log.e("CameraTest", "addCallbackBuffer error");
         }
     }
 });

 try {
     mCamera.startPreview();
 } catch (Throwable e) {
     mCamera.release();
     mCamera = null;
     e.printStackTrace();
     return;
 }

You need to design or use some protocol to share one connection for sending the video and audio data, or you can simply use another connection (e.g. designated by a separate server port) for audio only, so that you have one connection for audio and another for video.
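For example, a minimal sketch of such a framing protocol could tag each packet with a type byte before sending it over the same socket. The packet layout, class name, and helper methods below are illustrative assumptions, not part of the original code:

    import java.io.IOException;
    import java.net.DatagramPacket;
    import java.net.DatagramSocket;
    import java.net.InetAddress;
    import java.nio.ByteBuffer;

    public class MediaPacketizer {
        public static final byte TYPE_VIDEO = 0;
        public static final byte TYPE_AUDIO = 1;

        // Wrap a media payload in a small header: [type:1][timestampMs:8][payload...]
        public static byte[] frame(byte type, long timestampMs, byte[] payload) {
            ByteBuffer buf = ByteBuffer.allocate(1 + 8 + payload.length);
            buf.put(type);
            buf.putLong(timestampMs);
            buf.put(payload);
            return buf.array();
        }

        // Send one framed packet over an already-open UDP socket.
        public static void send(DatagramSocket socket, InetAddress addr, int port,
                                byte type, byte[] payload) throws IOException {
            byte[] framed = frame(type, System.currentTimeMillis(), payload);
            socket.send(new DatagramPacket(framed, framed.length, addr, port));
        }

        // On the receiving side, read the type byte to route the payload
        // to either the video decoder or the audio player.
        public static void dispatch(byte[] packet) {
            ByteBuffer buf = ByteBuffer.wrap(packet);
            byte type = buf.get();
            long timestampMs = buf.getLong();
            byte[] payload = new byte[buf.remaining()];
            buf.get(payload);
            if (type == TYPE_VIDEO) {
                // e.g. BitmapFactory.decodeByteArray(payload, 0, payload.length)
            } else if (type == TYPE_AUDIO) {
                // e.g. write the PCM payload to an AudioTrack
            }
        }
    }

The timestamp in the header is what lets the receiver line audio and video back up, since the two streams arrive as independent packets.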

As to how to grab audio from the Android mic into a byte array, there is already an answer here: android record mic to ByteArray without saving audio file
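For reference, a minimal sketch of capturing mic audio into a byte array with AudioRecord might look like the following; the sample rate, class name, and the commented send call are assumptions for illustration:

    import android.media.AudioFormat;
    import android.media.AudioRecord;
    import android.media.MediaRecorder;

    public class MicCapture {
        private static final int SAMPLE_RATE = 44100;
        private volatile boolean recording = true;

        // Requires the RECORD_AUDIO permission; run this loop on a background thread.
        public void captureLoop() {
            int minBuf = AudioRecord.getMinBufferSize(SAMPLE_RATE,
                    AudioFormat.CHANNEL_IN_MONO, AudioFormat.ENCODING_PCM_16BIT);
            AudioRecord recorder = new AudioRecord(MediaRecorder.AudioSource.MIC,
                    SAMPLE_RATE, AudioFormat.CHANNEL_IN_MONO,
                    AudioFormat.ENCODING_PCM_16BIT, minBuf);

            byte[] buffer = new byte[minBuf];
            recorder.startRecording();
            try {
                while (recording) {
                    int read = recorder.read(buffer, 0, buffer.length);
                    if (read > 0) {
                        // Raw PCM bytes: send them as audio packets alongside the video frames,
                        // e.g. MediaPacketizer.send(socket, addr, port,
                        //         MediaPacketizer.TYPE_AUDIO, Arrays.copyOf(buffer, read));
                    }
                }
            } finally {
                recorder.stop();
                recorder.release();
            }
        }
    }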
