
POST Streaming Audio over HTTP/2 in Android

Some background:

I am trying to develop a voice search feature in an Android app: the user searches by voice, the server sends intermediate results while the user is speaking (which update the UI), and it sends the final result when the query is complete. Since the server only accepts a single HTTP/2 socket connection and Android's HttpURLConnection doesn't support HTTP/2 yet, I am using Retrofit 2.

I have looked at this, this, and this, but each example has fixed-length data, or the size can be determined beforehand, which is not the case for streaming audio search.

Here's what my method for POST looks like:

public interface Service {
    @Streaming
    @Multipart
    @POST("/api/1.0/voice/audio")
    Call<ResponseBody> post(
            @Part("configuration") RequestBody configuration,
            @Part("audio") RequestBody audio);
}

The method sends a configuration part (a JSON structure containing the audio parameters) followed by the streaming audio, so the expected POST request looks like this:

Content-Type = multipart/form-data;boundary=----------------------------414646844492477923682591
//HEADERS
----------------------------414646844492477923682591
Content-Type: application/json; charset=utf-8
Content-Disposition: form-data; name="configuration"
//JSON data structure with different audio parameters.
----------------------------414646844492477923682591
Content-Type: audio/wav; charset=utf-8
Content-Disposition: form-data; name="audio"
<audio_data>
----------------------------414646844492477923682591--
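To make the framing above concrete, here is a sketch that assembles the same multipart/form-data layout by hand (in practice OkHttp's MultipartBody generates this for you; the boundary string, sample JSON, and class name here are arbitrary illustrations). Note that each part delimiter is the boundary prefixed with `--`, lines are separated by CRLF, and the closing delimiter carries a trailing `--`:

```java
public class MultipartFrame {
    // Builds the multipart/form-data framing shown above, purely for
    // illustration. CRLF ("\r\n") line endings are required by MIME.
    static String frame(String boundary, String configJson, String audioPlaceholder) {
        String crlf = "\r\n";
        return "--" + boundary + crlf
             + "Content-Type: application/json; charset=utf-8" + crlf
             + "Content-Disposition: form-data; name=\"configuration\"" + crlf
             + crlf
             + configJson + crlf
             + "--" + boundary + crlf
             + "Content-Type: audio/wav" + crlf
             + "Content-Disposition: form-data; name=\"audio\"" + crlf
             + crlf
             + audioPlaceholder + crlf
             + "--" + boundary + "--" + crlf;   // closing delimiter ends with "--"
    }

    public static void main(String[] args) {
        System.out.print(frame("boundary123", "{\"sampleRate\":16000}", "<audio_data>"));
    }
}
```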

I'm not really sure how to send the streaming <audio_data>. I tried using Okio to create the multipart body for audio in this way (from https://github.com/square/okhttp/wiki/Recipes#post-streaming):

public RequestBody createPartForAudio(final byte[] samples) {
    return new RequestBody() {
        @Override
        public MediaType contentType() {
            return MediaType.parse("audio/wav; charset=utf-8");
        }

        @Override
        public void writeTo(BufferedSink sink) throws IOException {
            // Writes the whole buffer once; called when the request body is sent.
            sink.write(samples);
        }
    };
}

This didn't work, of course. Is this the right way to keep writing audio samples into a RequestBody? And where exactly should I call the Service.post(config, audio) method so that I don't end up posting the configuration part every time there is something in the audio buffer?

Also, since I have to keep sending streaming audio, how can I keep the same POST connection open and not close it until the user has stopped speaking?

I am basically new to OkHttp and Okio. If I have missed anything, or part of the code is unclear, please let me know and I'll upload that snippet. Thank you.

You might be able to use a Pipe to produce data from your audio thread and consume it on your networking thread.

From a newly created OkHttp recipe:

/**
 * This request body makes it possible for another
 * thread to stream data to the uploading request.
 * This is potentially useful for posting live event
 * streams like video capture. Callers should write
 * to {@code sink()} and close it to complete the post.
 */
static final class PipeBody extends RequestBody {
  private final Pipe pipe = new Pipe(8192);
  private final BufferedSink sink = Okio.buffer(pipe.sink());

  public BufferedSink sink() {
    return sink;
  }

  @Override public MediaType contentType() {
    ...
  }

  @Override public void writeTo(BufferedSink sink) throws IOException {
    sink.writeAll(pipe.source());
  }
}

This approach will work best if your data can be written as a continuous stream. If it can't, you might be better off doing something similar with a BlockingQueue<byte[]>.
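The BlockingQueue<byte[]> alternative can be sketched without any OkHttp types: the audio thread offers chunks, the networking thread drains them into the request body's output stream, and a sentinel marks the end of the stream (analogous to closing the Pipe's sink). The class and method names here are hypothetical; inside a real RequestBody, writeTo would call the drain loop against the BufferedSink's OutputStream:

```java
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.io.OutputStream;
import java.util.concurrent.BlockingQueue;
import java.util.concurrent.LinkedBlockingQueue;

public class ChunkedAudioBody {
    private static final byte[] EOS = new byte[0];                  // end-of-stream sentinel
    private final BlockingQueue<byte[]> chunks = new LinkedBlockingQueue<>();

    // Called from the audio thread for each buffer of recorded samples.
    public void write(byte[] samples) { chunks.add(samples); }

    // Called once when the user stops speaking; unblocks the drain loop.
    public void close() { chunks.add(EOS); }

    // Called from the networking thread (e.g. from RequestBody.writeTo):
    // blocks until chunks arrive and stops when the sentinel is taken.
    public void writeTo(OutputStream out) throws IOException {
        try {
            byte[] chunk;
            while ((chunk = chunks.take()) != EOS) {
                out.write(chunk);
            }
        } catch (InterruptedException e) {
            Thread.currentThread().interrupt();
            throw new IOException("interrupted while streaming audio", e);
        }
    }

    public static void main(String[] args) throws Exception {
        ChunkedAudioBody body = new ChunkedAudioBody();
        ByteArrayOutputStream out = new ByteArrayOutputStream();
        Thread producer = new Thread(() -> {
            body.write(new byte[]{1, 2});
            body.write(new byte[]{3});
            body.close();
        });
        producer.start();
        body.writeTo(out);                                          // blocks until close()
        producer.join();
        System.out.println(out.size());                             // 3 bytes streamed
    }
}
```

Because the request body blocks inside writeTo until close() is called, the POST stays open for the whole utterance, which is the same property the Pipe-based recipe provides.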
