
OpenGL - Trouble with first person camera matrix

I have tried my best to create a camera that mimics the style of a first-person camera. I have just switched from the old OpenGL way of rendering and am now ready to tackle a camera matrix. Here is the code for my camera update:

void Camera::update(float dt)
{
// Get the distance the camera has moved
float distance = dt * walkSpeed;

// Get the current mouse position
mousePos = mouse->getPosition();

// Translate the change to yaw and pitch
angleYaw -= ((float)mousePos.x-400.0f)*lookSpeed/40;
anglePitch -= ((float)mousePos.y-300.0f)*lookSpeed/40;

// Clamp the camera to a max/min viewing pitch
if(anglePitch > 90.0f)
    anglePitch = 90.0f;

if(anglePitch < -90.0f)
    anglePitch = -90.0f;

// Reset the mouse position
mouse->setPosition(mouseReset);

// Check for movement events
sf::Event event;
while (window->pollEvent(event))
{

    // Calculate the x, y and z values of any movement
    if (event.type == sf::Event::KeyPressed && event.key.code == sf::Keyboard::W)
    {
        position.x -= (float)sin(angleYaw*M_PI/180)*distance*25;
        position.z += (float)cos(angleYaw*M_PI/180)*distance*25;
        position.y += (float)sin(anglePitch * M_PI / 180) * distance * 25;
        angleYaw = 10.0;
    }
    if (event.type == sf::Event::KeyPressed && event.key.code == sf::Keyboard::S)
    {
        position.x += (float)sin(angleYaw*M_PI/180)*distance*25;
        position.z -= (float)cos(angleYaw*M_PI/180)*distance*25;
        position.y -= (float)sin(anglePitch * M_PI / 180) * distance * 25;
    }
    if (event.type == sf::Event::KeyPressed && event.key.code == sf::Keyboard::R)
    {
        position.x += (float)cos(angleYaw*M_PI/180)*distance*25;
        position.z += (float)sin(angleYaw*M_PI/180)*distance*25;
    }
    if (event.type == sf::Event::KeyPressed && event.key.code == sf::Keyboard::A)
    {
        position.x -= (float)cos(angleYaw*M_PI/180)*distance*25;
        position.z -= (float)sin(angleYaw*M_PI/180)*distance*25;
    }
}

// Update our camera matrix
camMatrix = glm::translate(glm::mat4(1.0f), glm::vec3(-position.x, -position.z, -position.y));
camMatrix = glm::rotate(camMatrix, angleYaw, glm::vec3(0, 1, 0));
camMatrix = glm::rotate(camMatrix, anglePitch, glm::vec3(1, 0, 0));
}

The last three lines are what I assumed would update the camera with the inverse of its translation (y and z are swapped for the coordinate layout I am working with). Did I apply them in the wrong order?

Here are my very simple vertex and fragment shaders:

#version 120

attribute vec4 position;
uniform mat4 camera;

void main()
{
    gl_Position = position * camera;
}

#version 120
void main(void)
{
    gl_FragColor = vec4(1.0, 0.0, 0.0, 1.0);
}

This simply renders a red triangle. The camera sort of rotates around the triangle, which is not what I want; I want the camera itself to rotate. I thought that multiplying each vertex by the camera matrix would render everything in camera space. Or do I need to multiply by the projection matrix as well?
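
For comparison, GLM builds column-major matrices, so the usual convention is to put the matrix on the left of the vector and to apply a projection matrix in the same expression. A minimal sketch of such a vertex shader, with an assumed extra projection uniform that is not in the original code:

#version 120

attribute vec4 position;
uniform mat4 projection; // assumed extra uniform, e.g. built with glm::perspective
uniform mat4 camera;     // the view matrix built in Camera::update

void main()
{
    // column-major GLM matrices go on the left of the vertex
    gl_Position = projection * camera * position;
}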

Pressing W, A, S, or D zooms in really close all at once and distorts the whole view, with red fragments everywhere.

Write your matrix operations in reverse order. So if you want to translate (to the camera position) and then rotate, write it in this order:

// Update our camera matrix
camMatrix = glm::rotate(glm::mat4(1.0f), anglePitch, glm::vec3(1, 0, 0));
camMatrix = glm::rotate(camMatrix, angleYaw, glm::vec3(0, 1, 0));
camMatrix = glm::translate(camMatrix, glm::vec3(-position.x, -position.z, -position.y));
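
If a projection matrix is used as well, a rough sketch (not from the original answer) of building one with GLM and uploading both matrices might look like this, assuming a hypothetical uploadCameraUniforms helper, a linked shaderProgram handle, and the projection uniform from the shader sketch above:

#include <GL/glew.h>                    // or whatever GL loader the project already uses
#include <glm/glm.hpp>
#include <glm/gtc/matrix_transform.hpp>
#include <glm/gtc/type_ptr.hpp>

// Called once per frame after Camera::update, with the shader bound via glUseProgram
void uploadCameraUniforms(GLuint shaderProgram, const glm::mat4 &camMatrix)
{
    // Build a perspective projection (recent GLM expects radians)
    glm::mat4 projection = glm::perspective(glm::radians(60.0f), // vertical field of view
                                            800.0f / 600.0f,     // window aspect ratio
                                            0.1f, 100.0f);       // near and far planes

    // Upload the view matrix built in Camera::update and the projection
    GLint camLoc  = glGetUniformLocation(shaderProgram, "camera");
    GLint projLoc = glGetUniformLocation(shaderProgram, "projection");
    glUniformMatrix4fv(camLoc,  1, GL_FALSE, glm::value_ptr(camMatrix));
    glUniformMatrix4fv(projLoc, 1, GL_FALSE, glm::value_ptr(projection));
}

Two things worth noting: recent GLM versions expect angles in radians, so the rotate calls above would need glm::radians(anglePitch) and glm::radians(angleYaw); and the reversed order falls out of the view matrix being the inverse of the camera's world transform, which is why the rotations come before the translation by -position.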
