
Given 4 points on a textured sphere, extract a 2D plane and its projected texture

Setting the scene: I'm working on a feature in SceneKit where I have a camera at the center of a sphere. The sphere has a texture wrapped around it. Let's say it's a 360-degree image captured inside a room.

So far I have identified the positions on the sphere that correspond to the corners of the floor. From those I can create a new flat 2D plane that matches the dimensions of the floor from the camera's perspective. E.g. if the room had a long rectangular floor, I'd create a trapezoid-shaped plane.

Problem: I would like the new 2D plane to have the texture of the floor, not just its shape. How do I do this, given that what I want to extract is not the original texture image but the result of its projection onto the sphere?

FYI, I'm pretty new to SceneKit and 3D graphics in general, and I'm even newer to OpenGL.

I assume that your image is structured in a way that lets you directly pick a pixel given an arbitrary direction. E.g. if the azimuth of the direction is mapped to the image's x-coordinate and the elevation of the direction to the image's y-coordinate, you would convert the direction to these two parameters and pick the color at the resulting coordinates. If that is not the case, you have to find the intersection of the corresponding ray (starting at the camera) with the sphere and look up the texture coordinate at that intersection. You can then pick the color using this texture coordinate.
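For the common equirectangular (lat-long) layout, the direction-to-pixel conversion can be sketched like this. This is a minimal Python sketch, not SceneKit API: the image is assumed to be a plain 2D array of pixel rows, with y pointing up and the azimuth seam at ±π; adapt the axis conventions to how your image was actually captured.

```python
import math

def sample(d, image, width, height):
    """Pick the color for a unit direction vector d = (x, y, z)
    from an equirectangular image (a list of pixel rows).
    Assumes y is up and the seam lies at azimuth +/- pi."""
    azimuth = math.atan2(d[0], d[2])                   # -pi .. pi
    elevation = math.asin(max(-1.0, min(1.0, d[1])))   # -pi/2 .. pi/2
    u = azimuth / (2.0 * math.pi) + 0.5                # 0 .. 1 left to right
    v = elevation / math.pi + 0.5                      # 0 .. 1 bottom to top
    x = min(int(u * width), width - 1)
    y = min(int((1.0 - v) * height), height - 1)       # rows are stored top-down
    return image[y][x]
```

Nearest-neighbor lookup is used here for brevity; bilinear filtering between the four surrounding pixels would give smoother results.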

Now, you have basically two options. The first option is generating a new texture for the plane. The second option is sampling the spherical image from a shader.

Option 1 - Generate a new texture

You know the extent of your plane, so you can generate a new texture whose dimensions are proportional to the plane's extents. You can use an arbitrary resolution. All you then need to do is fill the pixels of this texture. For this, you just generate the ray for a given pixel and look up the corresponding color in the spherical image like so:

input: d1, d2, d3, d4 (the four direction vectors of the plane corners)
//  d3 +------+ d4
//  d1 +------+ d2
for x from 0 to texture width
    for y from 0 to texture height
        //Find the direction vector for this pixel through bilinear interpolation
        a = x / (width - 1) //horizontal interpolation parameter
        b = y / (height - 1) //vertical interpolation parameter
        d = (1 - a) * ((1 - b) * d1 + b * d3) + a * ((1 - b) * d2 + b * d4)
        normalize d
        //Sample the spherical image at d
        color = sample(d)
        //write the color to the new planar texture
        texture(x, y) = color
    next
next

Then, you have a new texture that you can apply to the plane. Barycentric interpolation might be more appropriate if you express the plane as two triangles. But as long as the plane is rectangular, the results will be the same.

Note that the sample() method depends on your image structure and needs to be implemented appropriately.
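Putting it together, the loop above might look like this in Python. This is a sketch, not a definitive implementation: it assumes a `sample(d)` callback like the one discussed above and uses plain nested lists as the texture; in practice you would write into a pixel buffer and hand it to SceneKit as the plane material's contents.

```python
import math

def lerp3(p, q, t):
    """Linear interpolation between two 3-vectors."""
    return tuple((1.0 - t) * a + t * b for a, b in zip(p, q))

def normalize(v):
    n = math.sqrt(sum(c * c for c in v))
    return tuple(c / n for c in v)

def generate_plane_texture(d1, d2, d3, d4, width, height, sample):
    """Fill a width x height texture for a plane whose corners have the
    direction vectors d1 (bottom-left), d2 (bottom-right),
    d3 (top-left), d4 (top-right), as seen from the camera."""
    texture = [[None] * width for _ in range(height)]
    for y in range(height):
        b = y / (height - 1) if height > 1 else 0.0      # vertical parameter
        for x in range(width):
            a = x / (width - 1) if width > 1 else 0.0    # horizontal parameter
            # Bilinear interpolation of the corner directions
            bottom = lerp3(d1, d2, a)
            top = lerp3(d3, d4, a)
            d = normalize(lerp3(bottom, top, b))
            # Sample the spherical image in direction d
            texture[y][x] = sample(d)
    return texture
```

Passing `sample` in as a parameter keeps the loop independent of the spherical image's structure, which is exactly the separation the answer describes.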

Option 2 - Sample in a shader

In option 2, you do the same thing as in option 1, but in a fragment shader. You supply the vertices of the plane with their respective directions (this may just be the vertex position) and let the GPU interpolate them. This gives you the direction d directly, which you can use. Here is some shader code:

uniform sampler2D sphericalImage; // assuming an equirectangular (lat-long) image
in vec3 direction;
out vec4 color;
const float PI = 3.14159265359;
void main()
{
    vec3 d = normalize(direction);
    // convert the direction to lat-long texture coordinates
    float u = atan(d.x, d.z) / (2.0 * PI) + 0.5;
    float v = asin(clamp(d.y, -1.0, 1.0)) / PI + 0.5;
    color = texture(sphericalImage, vec2(u, v));
}

If your image is a cube map, you can even let the GPU do the sampling.
