
ffmpeg H264 Encode Frame at a time for network streaming

I'm working on a remote desktop application and would like to send H.264 packets over TCP, using ffmpeg for the encoding. However, I couldn't find useful information for the particular case of encoding just one frame at a time (already in YUV444) and retrieving the resulting packet.

I have several issues; the first was that:

avcodec_encode_video2

was not blocking: I found that most of the time you get the "delayed" frames at the end. However, since this is real-time streaming, the solution was:

av_opt_set(mCodecContext->priv_data, "tune", "zerolatency", 0);
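For context, a minimal sketch of how these low-latency options might be applied before avcodec_open2, assuming libx264 as the encoder; the "ultrafast" preset and the configureLowLatency helper name are my additions, not part of the original code:

extern "C" {
#include <libavcodec/avcodec.h>
#include <libavutil/opt.h>
}

static void configureLowLatency(AVCodecContext *ctx)
{
    // x264 private options: disable lookahead/B-frame buffering so each
    // call to the encoder can emit a packet for the frame it was given.
    av_opt_set(ctx->priv_data, "tune",   "zerolatency", 0);
    av_opt_set(ctx->priv_data, "preset", "ultrafast",   0); // assumption: trade quality for speed
}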

Now I get the frame right away, but there are several issues: it takes a while, and even worse, the result is a gray video full of garbage pixels. My configuration for the codec context:

m_pCodecCtx->bit_rate=8000000;
m_pCodecCtx->codec_id=AV_CODEC_ID_H264;
m_pCodecCtx->codec_type = AVMEDIA_TYPE_VIDEO;
m_pCodecCtx->width=1920;
m_pCodecCtx->height=1080;
m_pCodecCtx->pix_fmt=AV_PIX_FMT_YUV444P;
m_pCodecCtx->time_base.num = 1;
m_pCodecCtx->time_base.den = 25;
m_pCodecCtx->gop_size = 1;
m_pCodecCtx->keyint_min = 1;
m_pCodecCtx->i_quant_factor = float(0.71);
m_pCodecCtx->b_frame_strategy = 20;
m_pCodecCtx->qcompress = (float)0.6;
m_pCodecCtx->qmax = 51;
m_pCodecCtx->qmin = 20;
m_pCodecCtx->max_qdiff = 4;
m_pCodecCtx->refs = 4;
m_pCodecCtx->max_b_frames = 1;
m_pCodecCtx->thread_count = 1;

I would like to know how this could be done: how do I set the "I frames", and what would be optimal for "one frame at a time" encoding? Also, I'm not concerned with quality right now; it just needs to be fast enough (under 16 ms).
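As a sketch of one way to request an I-frame explicitly (assuming the libx264 encoder): mark the input frame before handing it to the encoder. With gop_size = 1 every frame is already an intra frame, so this mainly matters with a longer GOP, e.g. to force a refresh when a new client connects. The helper name is illustrative:

extern "C" {
#include <libavcodec/avcodec.h>
}

void requestKeyFrame(AVFrame *frame)
{
    // Ask the encoder to code this picture as an intra frame.
    frame->pict_type = AV_PICTURE_TYPE_I;
    frame->key_frame = 1;
}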

For the encoding part:

nres = avcodec_encode_video2(m_pCodecCtx, &packet, m_pFrame, &framefinished);

if(nres < 0){
    qDebug() << "error encoding: " << nres << endl;
}

if(framefinished){
    m_pFrame->pts++;
    ofstream vidout("video.h264", ios::app);
    if(vidout.good()){
        vidout.write((const char*)&packet.data[0], packet.size);
    }
    vidout.close();

    av_packet_unref(&packet);
}

I'm not using a container, just a raw file; ffplay plays raw files if the packets are right, and that's my principal issue. I'm planning to send the packets over TCP and decode on the client. Any help would be greatly appreciated.
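For reference, the client side could split the raw Annex-B byte stream received over TCP back into whole packets with an H.264 parser before decoding. This is only a sketch under that assumption; decodeChunk and the buffer handling are illustrative and use the same API generation as the code above:

extern "C" {
#include <libavcodec/avcodec.h>
}

void decodeChunk(AVCodecContext *decCtx, AVCodecParserContext *parser,
                 AVFrame *frame, const uint8_t *buf, int len)
{
    while (len > 0) {
        uint8_t *out = nullptr;
        int outSize = 0;
        // The parser finds NAL unit boundaries (start codes) in the byte stream.
        int used = av_parser_parse2(parser, decCtx, &out, &outSize,
                                    buf, len, AV_NOPTS_VALUE, AV_NOPTS_VALUE, 0);
        buf += used;
        len -= used;

        if (outSize > 0) {
            AVPacket pkt;
            av_init_packet(&pkt);
            pkt.data = out;
            pkt.size = outSize;

            int gotFrame = 0;
            if (avcodec_decode_video2(decCtx, frame, &gotFrame, &pkt) >= 0 && gotFrame) {
                // frame now holds a decoded YUV444 picture ready for display
            }
        }
    }
}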

You could take a look at the source code of WebRTC. It uses openh264 and ffmpeg to accomplish what you want.

I studied it for a while, but I can't get to the latest source code at the moment.

I found this: source code.

Hope it helps.

Turns out it had been working from the beginning; I made a very simple but important mistake: I was writing a binary file as text, so...
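In other words, the dump file just needed to be opened in binary mode so the runtime does not translate bytes (e.g. LF to CRLF on Windows) and corrupt the H.264 stream. A sketch of the corrected write, using the same packet as in the question:

ofstream vidout("video.h264", ios::binary | ios::app); // binary mode: no newline translation
if(vidout.good()){
    vidout.write((const char*)packet.data, packet.size);
}
vidout.close();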

Thanks for the feedback and your help.
