
Streaming live video and audio from the Android camera over socket.io

I want to stream real-time video from the Android camera to another device via socket.io, but I am not able to send audio.

I am using a SurfaceView to create the camera view.

In the camera preview callback I am getting the video bytes.

But those bytes do not contain any audio; they only contain the video frame.

How can I merge the video and audio into the same byte array while recording, send it to the other device, and then decode it on the other side?

The code I am using is like this:

        // Preview callback: each NV21 preview frame is JPEG-compressed and sent as one UDP datagram.
        mCamera.setPreviewCallbackWithBuffer(new Camera.PreviewCallback() {

            private long timestamp = 0;

            public synchronized void onPreviewFrame(byte[] data, Camera camera) {

                // Compress the raw NV21 frame to JPEG (quality 80).
                Camera.Size previewSize = camera.getParameters().getPreviewSize();
                YuvImage yuvimage = new YuvImage(data, ImageFormat.NV21, previewSize.width, previewSize.height, null);
                ByteArrayOutputStream baos = new ByteArrayOutputStream();
                yuvimage.compressToJpeg(new Rect(0, 0, previewSize.width, previewSize.height), 80, baos);
                byte[] jdata = baos.toByteArray();

                // Send the JPEG frame to the receiver.
                DatagramSocket s = null;
                try {
                    s = new DatagramSocket();
                    s.setBroadcast(true);
                    s.setSoTimeout(TIMEOUT_MS);
                    InetAddress local = InetAddress.getByName(IPAdresse.getText().toString());

                    DatagramPacket p = new DatagramPacket(jdata, jdata.length, local, server_port);
                    s.send(p);

                } catch (IOException e) {
                    // SocketException and UnknownHostException are both subclasses of IOException.
                    e.printStackTrace();
                } finally {
                    if (s != null) {
                        s.close();  // close the socket so one is not leaked per frame
                    }
                }

                // Decode the JPEG back to a Bitmap and show it locally for debugging.
                Bitmap bmp = BitmapFactory.decodeByteArray(jdata, 0, jdata.length);
                m_VideCaptureBitmap.setImageBitmap(bmp);

                Log.v("CameraTest", "Frame size = " + data.length);
                timestamp = System.currentTimeMillis();
                try {
                    // Return the buffer to the camera so it can be reused for the next frame.
                    camera.addCallbackBuffer(data);
                } catch (Exception e) {
                    Log.e("CameraTest", "addCallbackBuffer error");
                }
            }
        });

        try {
            mCamera.startPreview();
        } catch (Throwable e) {
            mCamera.release();
            mCamera = null;
            e.printStackTrace();
            return;
        }

You need to design or use some protocol to share one connection for sending the video and audio data, or you can simply use a second connection (e.g. on a separate server port) for audio only, so that you have one connection for audio and another for video.
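
A minimal sketch of such a framing protocol, assuming a single shared byte stream (for example a TCP socket or a socket.io binary channel): each packet carries one type byte plus a 4-byte length prefix, so the receiver can demultiplex the two streams and decode each one separately. The class name MediaFraming, the MediaFrame holder and the TYPE_* constants are illustrative, not part of your code.

    import java.io.ByteArrayOutputStream;
    import java.io.DataInputStream;
    import java.io.DataOutputStream;
    import java.io.IOException;
    import java.io.InputStream;

    // Illustrative framing helper: 1 type byte + 4-byte length + payload.
    public final class MediaFraming {
        public static final byte TYPE_VIDEO = 0x01;
        public static final byte TYPE_AUDIO = 0x02;

        // Wrap one JPEG video frame or one audio PCM chunk into a framed packet.
        public static byte[] frame(byte type, byte[] payload) throws IOException {
            ByteArrayOutputStream out = new ByteArrayOutputStream(payload.length + 5);
            DataOutputStream dos = new DataOutputStream(out);
            dos.writeByte(type);           // which stream this packet belongs to
            dos.writeInt(payload.length);  // payload size so the receiver knows where the packet ends
            dos.write(payload);
            dos.flush();
            return out.toByteArray();
        }

        // Receiving side: read exactly one framed packet from the shared stream.
        public static MediaFrame readFrame(InputStream in) throws IOException {
            DataInputStream dis = new DataInputStream(in);
            byte type = dis.readByte();
            int length = dis.readInt();
            byte[] payload = new byte[length];
            dis.readFully(payload);        // blocks until the whole payload has arrived
            return new MediaFrame(type, payload);
        }

        // Simple value holder for a demultiplexed packet.
        public static final class MediaFrame {
            public final byte type;
            public final byte[] payload;
            MediaFrame(byte type, byte[] payload) {
                this.type = type;
                this.payload = payload;
            }
        }
    }

On the receiving side, a TYPE_VIDEO payload can be decoded with BitmapFactory.decodeByteArray exactly as in your preview code, while a TYPE_AUDIO payload (raw PCM) can be played back through an AudioTrack configured with the same sample rate and channel settings as the recorder.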

As to how to grab audio from the Android mic into a byte array, there is already an answer here: android record mic to ByteArray without saving audio file
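
For reference, a rough sketch of that approach, assuming 44.1 kHz mono 16-bit PCM: it reads raw audio from AudioRecord on a background thread and hands each filled buffer to a callback so it can be sent alongside the video frames. It needs the RECORD_AUDIO permission, and the MicStreamer / AudioChunkListener names are made up for this example.

    import android.media.AudioFormat;
    import android.media.AudioRecord;
    import android.media.MediaRecorder;

    // Illustrative mic-capture loop (requires the RECORD_AUDIO permission).
    public final class MicStreamer {

        // Hypothetical callback for handing PCM chunks to the sender.
        public interface AudioChunkListener {
            void onAudioChunk(byte[] pcm, int length);
        }

        private volatile boolean running;

        public void start(final AudioChunkListener listener) {
            running = true;
            new Thread(new Runnable() {
                @Override public void run() {
                    int sampleRate = 44100;  // 44.1 kHz mono 16-bit PCM
                    int bufferSize = AudioRecord.getMinBufferSize(
                            sampleRate,
                            AudioFormat.CHANNEL_IN_MONO,
                            AudioFormat.ENCODING_PCM_16BIT);
                    AudioRecord recorder = new AudioRecord(
                            MediaRecorder.AudioSource.MIC,
                            sampleRate,
                            AudioFormat.CHANNEL_IN_MONO,
                            AudioFormat.ENCODING_PCM_16BIT,
                            bufferSize);
                    byte[] buffer = new byte[bufferSize];
                    recorder.startRecording();
                    while (running) {
                        int read = recorder.read(buffer, 0, buffer.length);
                        if (read > 0) {
                            // Hand the raw PCM chunk to the sender; the callback should
                            // copy the bytes it needs before this loop reuses the buffer.
                            listener.onAudioChunk(buffer, read);
                        }
                    }
                    recorder.stop();
                    recorder.release();
                }
            }).start();
        }

        public void stop() {
            running = false;
        }
    }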
