
OpenGL rendering (only real-time)?

I understand that you usually create complex 3D models in Blender or some other 3D modelling software and afterwards export them as .obj. This .obj file gets parsed into your program, and OpenGL renders it. This, as far as I understand, is real-time rendering.

Now I was wondering if there is something like pre-rendered objects. I'm a little bit confused, because there are so many articles/videos about real-time rendering, but I haven't found any information about non-real-time rendering. Does something like this exist or not? The only thing that comes to my mind as non-real-time rendering would be a video.

I guess this is pretty much a yes-or-no question :) but if it exists, maybe someone could point me to some websites with explanations.

"Real-time rendering" means that the frames are being generated as fast as they can be displayed. “实时渲染”表示正在以尽可能快的显示速度生成帧。 "Non-real-time rendering", or "offline rendering" means generating frames one at a time, taking as much time as necessary to achieve the desired image quality, and then later assembling them into a movie. “非实时渲染”或“脱机渲染”是指一次生成一个帧,花费尽可能多的时间来获得所需的图像质量,然后将它们组装成电影。 Video at the quality of video games can be rendered in real time; 具有视频游戏质量的视频可以实时渲染; something as elaborate as a Pixar movie, though, has to be done in offline mode. 不过,必须在离线模式下完成一些像皮克斯电影一样精致的事情。 Individual frames can still take hours of rendering time! 单个帧仍需要花费数小时的渲染时间!

It's not entirely clear what you mean by "pre-rendered objects"; however, there are things called VBOs and vertex arrays that store an object's geometry in VRAM, so that you don't have to feed it into the rendering pipeline every frame using glVertex3f() or similar. That per-vertex, per-frame approach is called immediate mode.

VBOs and vertex arrays are used instead of immediate mode because they are far faster: rather than calling into the graphics driver to upload data for every vertex on every frame, the geometry is kept in VRAM, which is faster than normal RAM, ready to be fed into the render pipeline.
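A minimal sketch of the VBO approach, assuming a GL context and a compiled shader program already exist (`vertices` and `vertex_count` are placeholder names; this fragment is illustrative, not a complete program):

```c
/* One-time setup: upload the geometry; the driver keeps it in VRAM. */
GLuint vbo;
glGenBuffers(1, &vbo);
glBindBuffer(GL_ARRAY_BUFFER, vbo);
glBufferData(GL_ARRAY_BUFFER, sizeof(vertices), vertices, GL_STATIC_DRAW);

/* Every frame: no per-vertex driver calls, just bind and draw. */
glBindBuffer(GL_ARRAY_BUFFER, vbo);
glVertexAttribPointer(0, 3, GL_FLOAT, GL_FALSE, 0, (void *)0);
glEnableVertexAttribArray(0);
glDrawArrays(GL_TRIANGLES, 0, vertex_count);
```

Compare this with immediate mode, where the equivalent draw would be a loop of glVertex3f() calls executed again on every single frame.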

The page here may help, too. 这里的页面也可能有帮助。

There's nothing stopping you from rendering to an off-screen framebuffer (i.e., an FBO) and then saving that to disk rather than displaying it on the screen. For instance, that's how GPGPU techniques used to work before the advent of CUDA, OpenCL, etc.: you would load your data as an unfiltered floating-point texture, perform your calculation using pixel shaders on the FBO, and then save the results back to disk.
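The off-screen path can be sketched roughly as follows, again assuming a live GL context (`width` and `height` are placeholders, and error checking and the actual image-file writing are omitted):

```c
/* Create an FBO with a texture as its color attachment. */
GLuint fbo, tex;
glGenFramebuffers(1, &fbo);
glBindFramebuffer(GL_FRAMEBUFFER, fbo);

glGenTextures(1, &tex);
glBindTexture(GL_TEXTURE_2D, tex);
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA8, width, height, 0,
             GL_RGBA, GL_UNSIGNED_BYTE, NULL);
glFramebufferTexture2D(GL_FRAMEBUFFER, GL_COLOR_ATTACHMENT0,
                       GL_TEXTURE_2D, tex, 0);

/* ... render the scene as usual, taking as long as needed ... */

/* Read the finished frame back and write it out, e.g. one image per frame. */
unsigned char *pixels = malloc((size_t)width * height * 4);
glReadPixels(0, 0, width, height, GL_RGBA, GL_UNSIGNED_BYTE, pixels);
```

Each saved frame then becomes one still of the final movie, however long it took to render.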

In the link I posted above, it states in the overview:

This extension defines a simple interface for drawing to rendering destinations other than the buffers provided to the GL by the window-system.

It then goes on to state:

By allowing the use of a framebuffer-attachable image as a rendering destination, this extension enables a form of "offscreen" rendering.

So, you would get your "non-real-time" rendering by rendering off-screen a scene that renders slower than 30 fps, and then saving those results to a movie file or file-sequence format that can be played back at a later date.
