
LibGDX : Rendering a TextureRegion renders whole texture

So I have a large Texture that is broken up into 64x64 blocks.

I load this into LibGDX using:

texture = new Texture("texturemap.png");
regions = TextureRegion.split(texture, 64, 64);

I create a Cube (mesh)

modelBuilder = new ModelBuilder();
cube = modelBuilder.createBox(Constants.cubeSize, Constants.cubeSize, Constants.cubeSize,
                new Material(ColorAttribute.createDiffuse(Color.BLUE)),
                VertexAttributes.Usage.Position | VertexAttributes.Usage.Normal | VertexAttributes.Usage.TextureCoordinates);

In my render loop I want to choose which texture region to render, however it always renders the whole texture:

Gdx.gl20.glEnable(GL20.GL_TEXTURE_2D);
Gdx.gl20.glEnable(GL20.GL_BLEND);
Gdx.gl20.glBlendFunc(GL20.GL_SRC_ALPHA, GL20.GL_ONE_MINUS_SRC_ALPHA);
Gdx.gl20.glCullFace(GL20.GL_BACK);
Gdx.gl20.glEnable(GL20.GL_DEPTH_TEST);


shaderProgram.begin();
//texture.bind();
regions[1][5].getTexture().bind();
shaderProgram.setUniformMatrix("u_projTrans", camera.combined);
shaderProgram.setAttributef("a_color", 1, 1, 1, 1);
shaderProgram.setUniformi("u_texture", 0);

for (int i = 0; i < chunks2.size; i++) {
    chunks2.get(i).render(shaderProgram);
}

shaderProgram.end();

Am I missing the point of TextureRegions? It always renders the whole texture.

You are only using the region to get the Texture it points to. You are never actually using the TextureRegion itself.

A TextureRegion defines, well, a region of a texture. For that it refers to a Texture as well as the location and size of the region on that texture. For rendering you typically want this location and size to be normalized, also called UV coordinates. You can get those from the TextureRegion by using the getU() and getV() methods for the first corner and the getU2() and getV2() methods for the other corner. You can get the texture, as you already found, using the getTexture() method.
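For illustration, here is a minimal sketch (assuming the regions array from your question) of reading those values from one region:

TextureRegion region = regions[1][5];
float u  = region.getU();   // first corner, U
float v  = region.getV();   // first corner, V
float u2 = region.getU2();  // opposite corner, U
float v2 = region.getV2();  // opposite corner, V
Texture tex = region.getTexture(); // the texture the region refers to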

There are basically two ways you can do that:

a) Bake the texture region onto the mesh, effectively scaling and moving the UV coordinates of your Mesh. Since you're using ModelBuilder, you can do that using the setUVRange method when creating the part. The createBox method you're using is a convenience method that you probably want to avoid using unless you're debugging. Check its source to see what it actually does. In your case you could use this instead:

modelBuilder.begin();
MeshPartBuilder mpb = modelBuilder.part("box", primitiveType, attributes, material);
mpb.setUVRange(yourRegion);
mpb.box(width, height, depth);
cube = modelBuilder.end();
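Filled in with the values from your question, that could look something like the sketch below. GL20.GL_TRIANGLES as the primitive type and the TextureAttribute-based material are assumptions on my part; use whatever material and attributes your own shader actually expects.

modelBuilder.begin();
MeshPartBuilder mpb = modelBuilder.part("box", GL20.GL_TRIANGLES,
        VertexAttributes.Usage.Position | VertexAttributes.Usage.Normal | VertexAttributes.Usage.TextureCoordinates,
        new Material(TextureAttribute.createDiffuse(texture))); // assumed material, adjust as needed
mpb.setUVRange(regions[1][5]); // bake this region's UVs into the part
mpb.box(Constants.cubeSize, Constants.cubeSize, Constants.cubeSize);
cube = modelBuilder.end();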

b) Scale the coordinates in your shader. For this you provide the region, in the form of a translation and a scaling, to your shader as uniforms. Note that this implies that you need a separate render call for each separate region, so if you're using TextureRegions because you've packed your textures hoping to gain performance, then this is not the option you want.

The DefaultShader (e.g. as used by ModelBatch) supports a TextureRegion for diffuse textures by default; see that implementation for an example. Since you're using your own shader, you'd have to implement it in that shader yourself. It is typically a one-liner in your vertex shader, e.g.:

v_diffuseUV = u_diffuseUVTransform.xy + a_texCoord0 * u_diffuseUVTransform.zw;

Where u_diffuseUVTransform is a vec4 containing U and V (.xy) and U2-U and V2-V (.zw).
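On the Java side you could then fill that uniform from the region, for example (a sketch assuming the regions array and shaderProgram from your question; u_diffuseUVTransform is just the uniform name used above, use whatever name your shader declares):

TextureRegion r = regions[1][5];
shaderProgram.setUniformf("u_diffuseUVTransform",
        r.getU(), r.getV(),       // offset: first corner (U, V)
        r.getU2() - r.getU(),     // scale: U2 - U
        r.getV2() - r.getV());    // scale: V2 - V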
