
Streaming audio and video from Android to PC/web.

I am a recent beginner to the Android SDK, and the overall goal of this project is to create an app very similar to Ustream's or Qik's (yeah, I know, not the best idea for a beginner). I need to stream live audio and video to the web. There will be a video server, most likely running Wowza, handling the encoding of the video to the proper format.

From what I have found so far, I need to use Android's MediaRecorder with the camera as the source and direct the output to the server. That makes sense to me, but I do not know exactly how to go about doing it. Can anyone give me a push in the right direction? I have browsed through an example at "http://ipcamera-for-android.googlecode.com/svn/trunk", but it appears far more complicated than necessary for what I need to do, and I have been unable to get it working in Eclipse to test it anyway.

Doing so is not simple, but it is possible.

The MediaRecorder API assumes that the output is a random-access file, meaning it can seek back and forth while writing the mp4 (or other) file container. As you can see in ipcamera-for-android, the output is directed to a socket, which is not random access. This makes the outgoing stream hard to parse, since the MediaRecorder API "writes" some data, such as fps and sps/pps (for H.264), only when the recording is done. The API will try to seek back to the beginning of the stream (where the file header lives), but it will fail because the stream was sent to a socket and not to a file.

ipcamera-for-android is a good reference. If I recall correctly, before streaming it records a video to a file, opens the header and takes what it needs from there; then it starts recording to the socket and uses the data it took from the header in order to parse the stream.

You will also need a basic understanding of parsing mp4 (or whichever file container you want to use) in order to capture the frames. You can do that either on the device or on the server side.
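To make the container parsing more concrete, here is a hedged sketch (plain Java, not code from ipcamera-for-android or SipDroid) of walking the top-level box structure of an ISO base media (mp4) file. Each box starts with a 4-byte big-endian size that includes the 8-byte header, followed by a 4-byte ASCII type; a real parser would descend into the `moov` box to reach the sps/pps.

```java
import java.nio.ByteBuffer;
import java.nio.charset.StandardCharsets;
import java.util.ArrayList;
import java.util.List;

public class Mp4Boxes {
    // Walk the top-level boxes of an mp4 byte buffer and return "type size" strings,
    // e.g. "ftyp 16", "moov 8". Illustrative only: the special size values
    // 1 (64-bit extended size) and 0 (box runs to end of file) are not handled.
    public static List<String> listBoxes(byte[] data) {
        List<String> boxes = new ArrayList<>();
        ByteBuffer buf = ByteBuffer.wrap(data); // ByteBuffer is big-endian by default
        while (buf.remaining() >= 8) {
            long size = buf.getInt() & 0xFFFFFFFFL;          // 4-byte box size
            byte[] type = new byte[4];
            buf.get(type);                                   // 4-byte ASCII box type
            boxes.add(new String(type, StandardCharsets.US_ASCII) + " " + size);
            if (size < 8 || size - 8 > buf.remaining()) break; // malformed or truncated
            buf.position(buf.position() + (int) (size - 8)); // skip the box payload
        }
        return boxes;
    }
}
```

Feeding it the first bytes that MediaRecorder emits is a quick way to see why the socket output is awkward: `mdat` appears before the `moov` header has been written.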

Here is a good starting point for writing the stream to a socket: Tutorial

I hope this was helpful. There is no good tutorial for parsing and decoding the outgoing stream, since it is not so simple... but again, it is possible with some effort.

Also take a look here to see how to direct the output to a stream that can be sent to the server: MediaRecorder Question

SipDroid does exactly what you need.

It involves a hack to circumvent the limitation of the MediaRecorder class, which requires a file descriptor. It saves the MediaRecorder video stream to a local socket (used as a kind of pipe), then re-reads from the other end of that socket (in the same application, but in another thread), creates RTP packets out of the received data, and finally sends the RTP packets to the network (you can use broadcast or unicast mode here, as you wish).

Basically it boils down to the following (simplified code):

// Create a MediaRecorder
MediaRecorder mr = new MediaRecorder();
// (Initialize mr as usual)
// Create a LocalServerSocket
LocalServerSocket lss = new LocalServerSocket("foobar");
// Connect both ends of this socket
// (connect the client side first: accept() blocks until a connection arrives)
LocalSocket receiver = new LocalSocket();
receiver.connect(new LocalSocketAddress("foobar"));
LocalSocket sender = lss.accept();
// Set the output of the MediaRecorder to the sender socket file descriptor
mr.setOutputFile(sender.getFileDescriptor());
// Start the video recording:
mr.start();
// Launch a background thread that will loop,
// reading from the receiver socket,
// and creating an RTP packet out of the read data.
// (RtpSocket and RtpPacket come from the SipDroid project.)
RtpSocket rtpSocket = new RtpSocket();
InputStream in = receiver.getInputStream();
byte[] buffer = new byte[4096];
while (true) {
    int len = in.read(buffer);
    if (len < 0) break; // the recorder closed its end of the pipe
    // Here some data manipulation on the received buffer ...
    RtpPacket rtp = new RtpPacket(buffer, len);
    rtpSocket.send(rtp);
}

The implementation of the RtpPacket and RtpSocket classes (rather simple), and the exact code that manipulates the video stream content, can be found in the SipDroid project (especially VideoCamera.java).
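For orientation, here is a minimal sketch of the fixed 12-byte RTP header that such packetizing code produces, following RFC 3550. This is an illustrative stand-in, not SipDroid's actual RtpPacket implementation; the field values passed by the caller (payload type, sequence number, timestamp, SSRC) are assumptions for the example.

```java
import java.nio.ByteBuffer;

public class RtpHeader {
    // Prepend a minimal RTP header (RFC 3550) to a payload:
    // version=2, no padding, no extension, no CSRC list.
    public static byte[] packetize(byte[] payload, boolean marker, int payloadType,
                                   int seq, long timestamp, int ssrc) {
        ByteBuffer buf = ByteBuffer.allocate(12 + payload.length);
        buf.put((byte) 0x80);                                     // V=2, P=0, X=0, CC=0
        buf.put((byte) ((marker ? 0x80 : 0) | (payloadType & 0x7F))); // M bit + PT
        buf.putShort((short) seq);                                // sequence number
        buf.putInt((int) timestamp);                              // media timestamp
        buf.putInt(ssrc);                                         // stream source id
        buf.put(payload);
        return buf.array();
    }
}
```

In the loop above, each chunk read from the receiver socket would be wrapped this way (with the sequence number incremented per packet) before being written to a UDP socket.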
