
How to build a low latency audio streaming Android app

We are trying to stream live audio from the phone's mic to a server, but we are experiencing a 200 ms delay and would like to minimize it.

Is there a good API, or a more efficient way or protocol, to do this? We thought about using the SIP/RTP protocols somehow; would that be more efficient?
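For context on the RTP idea: RTP over UDP adds only a 12-byte header per packet, so the protocol itself costs very little latency; packet size and receiver-side buffering matter far more. A minimal sketch of RTP packetization (a hypothetical helper, not from any library; header layout per RFC 3550, assuming no CSRC list, extensions or padding):

```java
import java.nio.ByteBuffer;

// Sketch: wrap a raw audio chunk in a minimal 12-byte RTP header.
// Sequence number and timestamp management are up to the caller.
public class RtpPacketizer {
    public static byte[] rtpPacket(int seq, long timestamp, int ssrc,
                                   int payloadType, byte[] payload) {
        ByteBuffer buf = ByteBuffer.allocate(12 + payload.length);
        buf.put((byte) 0x80);                 // V=2, P=0, X=0, CC=0
        buf.put((byte) (payloadType & 0x7F)); // M=0, 7-bit payload type
        buf.putShort((short) seq);            // 16-bit sequence number
        buf.putInt((int) timestamp);          // 32-bit media timestamp
        buf.putInt(ssrc);                     // stream source identifier
        buf.put(payload);                     // the audio data itself
        return buf.array();
    }
}
```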

Here's the code; what would you recommend changing?

package com.awesome.audiostream;

import java.io.IOException;
import java.net.DatagramPacket;
import java.net.DatagramSocket;
import java.net.InetAddress;
import java.net.UnknownHostException;

import android.app.Activity;
import android.media.AudioFormat;
import android.media.AudioRecord;
import android.media.MediaRecorder;
import android.os.Bundle;
import android.util.Log;
import android.view.View;
import android.view.View.OnClickListener;
import android.widget.Button;

public class MainActivity extends Activity {
    private Button startButton,stopButton;

    public byte[] buffer;
    public static DatagramSocket socket;
    private int port=12345;

    AudioRecord recorder;

    private int sampleRate = 176400; // note: most devices only guarantee 44100 Hz capture
    private int channelConfig = AudioFormat.CHANNEL_IN_MONO; // CHANNEL_CONFIGURATION_MONO is deprecated
    private int audioFormat = AudioFormat.ENCODING_PCM_16BIT;
    int minBufSize = AudioRecord.getMinBufferSize(sampleRate, channelConfig, audioFormat);

    private boolean status = true;


    @Override
    public void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        //setContentView(R.layout.activity_main);
        View mv = getLayoutInflater().inflate(R.layout.activity_main, null);
        setContentView(mv);

        startButton = (Button) findViewById (R.id.start_button);
        stopButton = (Button) findViewById (R.id.stop_button);

        startButton.setOnClickListener (startListener);
        stopButton.setOnClickListener (stopListener);

    }

    private final OnClickListener stopListener = new OnClickListener() {

        @Override
        public void onClick(View arg0) {
            status = false;
            recorder.stop();      // stop capture before releasing
            recorder.release();
            Log.d("VS", "Recorder stopped and released");
        }
    };

    private final OnClickListener startListener = new OnClickListener() {

        @Override
        public void onClick(View arg0) {
            status = true;
            startStreaming();
        }
    };

    public void startStreaming() {
        Thread streamThread = new Thread(new Runnable() {

            @Override
            public void run() {
                try {
                    socket = new DatagramSocket(); // use the field; don't shadow it with a local
                    Log.d("VS", "Socket Created");

                    byte[] buffer = new byte[minBufSize];

                    Log.d("VS","Buffer created of size " + minBufSize);
                    DatagramPacket packet;

                    // The IP address of the server receiving the audio stream
                    final InetAddress destination = InetAddress.getByName("192.168.43.204");
                    Log.d("VS", "Address retrieved");

                    recorder = new AudioRecord(MediaRecorder.AudioSource.MIC,sampleRate,channelConfig,audioFormat,minBufSize*10);
                    Log.d("VS", "Recorder initialized");
                    recorder.startRecording();

                    while (status) {
                        // read data from the mic into the buffer
                        int bytesRead = recorder.read(buffer, 0, buffer.length);

                        // send only the bytes actually read this iteration
                        packet = new DatagramPacket(buffer, bytesRead, destination, port);

                        socket.send(packet);
                        Log.d("VS", "Bytes read: " + bytesRead);
                    }

                } catch(UnknownHostException e) {
                    Log.e("VS", "UnknownHostException");
                } catch (IOException e) {
                    e.printStackTrace();
                    Log.e("VS", "IOException");
                }
            }

        });
        streamThread.start();
    }
}

Late answer, but for what it may be worth: the main bottleneck on Android is usually the audio capture minimum buffer size; not many devices are able to provide small buffers. What values are you getting?

Regarding APIs, you may want to check out this documentation, if you haven't yet:

https://developer.android.com/ndk/guides/audio/audio-latency
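As a quick sanity check, a capture buffer size can be converted into the milliseconds of latency it alone contributes. A sketch in plain Java (the 3584-byte figure is a hypothetical getMinBufferSize() result, assuming mono 16-bit PCM at 44100 Hz):

```java
public class BufferLatency {
    // Convert a PCM buffer size in bytes into its duration in milliseconds.
    static double bufferMillis(int bytes, int sampleRate, int channels, int bytesPerSample) {
        int frames = bytes / (channels * bytesPerSample);
        return frames * 1000.0 / sampleRate;
    }

    public static void main(String[] args) {
        // Hypothetical minimum buffer: 3584 bytes, mono 16-bit at 44100 Hz
        System.out.printf("%.1f ms%n", bufferMillis(3584, 44100, 1, 2));
    }
}
```

If the minimum capture buffer alone is tens of milliseconds, that puts a hard floor under the end-to-end delay before the network is even involved.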

Then there is the buffering on the receiver side, which compensates for momentary high network loads where the desired audio bitrate cannot be met: it's a trade-off between latency and stability of the stream, since a buffer underrun translates into audible artifacts. You need to select a transmission protocol, a codec and a media player that allow you to minimize that buffering.
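On the receiver side, that trade-off is usually implemented as a jitter buffer: pre-buffer a few packets before starting playback, and re-buffer after an underrun. A minimal sketch in plain Java (the depths are illustrative parameters, not recommendations):

```java
import java.util.ArrayDeque;

// Minimal jitter buffer: absorbs bursts of packets, starts draining only
// once a target depth is reached, and drops the oldest packet on overflow.
public class JitterBuffer {
    private final ArrayDeque<byte[]> queue = new ArrayDeque<>();
    private final int targetDepth;  // packets to pre-buffer before playback
    private final int maxDepth;     // hard cap: beyond this, drop the oldest
    private boolean playing = false;

    public JitterBuffer(int targetDepth, int maxDepth) {
        this.targetDepth = targetDepth;
        this.maxDepth = maxDepth;
    }

    public synchronized void push(byte[] packet) {
        if (queue.size() >= maxDepth) {
            queue.pollFirst(); // overflow: sacrifice the oldest packet
        }
        queue.addLast(packet);
        if (queue.size() >= targetDepth) {
            playing = true; // enough buffered to ride out network jitter
        }
    }

    // Returns the next packet to play, or null on underrun.
    public synchronized byte[] pop() {
        if (!playing) return null;
        byte[] packet = queue.pollFirst();
        if (packet == null) {
            playing = false; // underrun: re-buffer before resuming playback
        }
        return packet;
    }
}
```

A larger targetDepth makes the stream more robust to jitter at the cost of latency; for a 200 ms budget, the capture buffer, network delay and this buffer all have to share that budget.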
