
Client-Server Based Audio Live Streaming in Java using Socket

The given code is intended to do live audio streaming between a client and a server using Java sockets, but the problem is that when I run this project, the client starts recording sound and sends it to the receiver (server) side. The server buffers the received sound but does not play it simultaneously; only when the client is closed does the server start playing the sound. Please help me. I need the server to play the received sound as it arrives.

// client

import java.io.*;
import java.net.*;
import javax.sound.sampled.*;
import org.apache.commons.io.IOUtils; 
public class ClientStream {

    private IncomingSoundListener isl = new IncomingSoundListener();

    AudioFormat format = getAudioFormat();
    InputStream is;
    Socket client;
    String serverName = "192.168.2.8";
    int port = 3000;
    boolean inVoice = true;

    public ClientStream() throws IOException {
        isl.runListener();
    }

    private AudioFormat getAudioFormat() {
        float sampleRate = 16000.0F;
        int sampleSizeBits = 16;
        int channels = 1;
        boolean signed = true;
        boolean bigEndian = false;

        return new AudioFormat(sampleRate, sampleSizeBits, channels, signed, bigEndian);
    }

    class IncomingSoundListener {
        public void runListener() {
            try {
                System.out.println("Connecting to server: " + serverName + " Port: " + port);
                client = new Socket(serverName, port);
                System.out.println("Connected to: " + client.getRemoteSocketAddress());
                System.out.println("Listening for incoming audio.");
                DataLine.Info speakerInfo = new DataLine.Info(SourceDataLine.class, format);
                SourceDataLine speaker = (SourceDataLine) AudioSystem.getLine(speakerInfo);
                speaker.open(format);
                speaker.start();
                while (inVoice) {
                    is = client.getInputStream();
                    byte[] data = IOUtils.toByteArray(is);
                    ByteArrayInputStream bais = new ByteArrayInputStream(data);
                    AudioInputStream ais = new AudioInputStream(bais, format, data.length);
                    int bytesRead = 0;
                    if ((bytesRead = ais.read(data)) != -1) {
                        System.out.println("Writing to audio output.");
                        speaker.write(data, 0, bytesRead);
                        // bais.reset();
                    }
                    ais.close();
                    bais.close();
                }
                speaker.drain();
                speaker.close();
                System.out.println("Stopped listening to incoming audio.");
            } catch (Exception e) {
                e.printStackTrace();
            }
        }
    }

    public static void main(String[] args) throws IOException {
        new ClientStream();
    }
}

// server

import java.io.*;
import java.net.*;
import javax.sound.sampled.*;
public class ServerStream {

    private OutgoingSoundListener osl = new OutgoingSoundListener();

    boolean outVoice = true;
    AudioFormat format = getAudioFormat();

    private ServerSocket serverSocket;
    Socket server;

    private AudioFormat getAudioFormat() {
        float sampleRate = 16000.0F;
        int sampleSizeBits = 16;
        int channels = 1;
        boolean signed = true;
        boolean bigEndian = false;

        return new AudioFormat(sampleRate, sampleSizeBits, channels, signed, bigEndian);
    }

    public ServerStream() throws IOException {
        try {
            System.out.println("Creating Socket...");
            serverSocket = new ServerSocket(3000);
            osl.runSender();
        } catch (Exception e) {
            e.printStackTrace();
        }
    }

    class OutgoingSoundListener {
        public void runSender() {
            try {
                server = serverSocket.accept();
                System.out.println("Listening from mic.");
                DataOutputStream out = new DataOutputStream(server.getOutputStream());
                DataLine.Info micInfo = new DataLine.Info(TargetDataLine.class, format);
                TargetDataLine mic = (TargetDataLine) AudioSystem.getLine(micInfo);
                mic.open(format);
                System.out.println("Mic open.");
                byte[] tmpBuff = new byte[mic.getBufferSize() / 5];
                mic.start();
                while (outVoice) {
                    System.out.println("Reading from mic.");
                    int count = mic.read(tmpBuff, 0, tmpBuff.length);
                    if (count > 0) {
                        System.out.println("Writing buffer to server.");
                        out.write(tmpBuff, 0, count);
                    }
                }
                mic.drain();
                mic.close();
                System.out.println("Stopped listening from mic.");
            } catch (Exception e) {
                e.printStackTrace();
            }
        }
    }

    public static void main(String[] args) throws IOException {
        new ServerStream();
    }
}

A client-server connection via Socket is based on TCP, as you can confirm in the docs.

What you seek is DatagramSocket, which is based on UDP. You might suffer packet loss, but that's the way things work; that's how video streaming works too: you get some, you lose some.

Now, to your question per se: one of the problems with a TCP implementation is that TCP relies on acknowledgements to keep the communication synchronized. If either your server or client fails to acknowledge, the stream can get stuck because of that.
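A minimal sketch of the DatagramSocket approach, under these assumptions: the class name UdpAudioSketch is made up, the loopback round trip stands in for your two machines, and the demo buffer stands in for a chunk produced by mic.read(...). The point is the shape of the loop: each captured buffer becomes one datagram, and the receiver can hand every packet to speaker.write(...) the moment it arrives, instead of waiting for the whole stream to end.

```java
import java.net.DatagramPacket;
import java.net.DatagramSocket;
import java.net.InetAddress;

public class UdpAudioSketch {

    // Sends one audio chunk over UDP to a local receiver and returns the
    // number of bytes that arrived. In a real streamer the chunk would come
    // from mic.read(...) on the sending side, and the receiving side would
    // pass the packet payload straight to speaker.write(...).
    static int roundTrip(byte[] chunk) throws Exception {
        try (DatagramSocket receiver = new DatagramSocket(0); // ephemeral port
             DatagramSocket sender = new DatagramSocket()) {

            // Sender: one datagram per captured buffer.
            sender.send(new DatagramPacket(chunk, chunk.length,
                    InetAddress.getLoopbackAddress(), receiver.getLocalPort()));

            // Receiver: each datagram arrives whole (or is lost entirely),
            // so it can be played immediately -- there is no waiting for a
            // stream EOF the way IOUtils.toByteArray(is) waits on TCP.
            byte[] buf = new byte[4096];
            DatagramPacket pkt = new DatagramPacket(buf, buf.length);
            receiver.receive(pkt);
            return pkt.getLength();
        }
    }

    public static void main(String[] args) throws Exception {
        System.out.println("received " + roundTrip(new byte[1024]) + " bytes");
    }
}
```

On the playing side, the loop would then be: receiver.receive(pkt); speaker.write(buf, 0, pkt.getLength()); which writes audio out packet by packet rather than buffering until the peer closes.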
