
Textures not working in OpenGL/Tao for C# on Intel chipsets

I've got a lot of OpenGL rendering code in a C# .NET 3.5 SP1 application. Some of the code can be found in this question.

Huge problem: the code is not running at all on Intel chipsets such as the 915M or the Q35. These chipsets are spec'd to run OpenGL 1.4, and my code plays nice with power-of-two textures and so forth. I've tried updating to the latest drivers from either Dell or Intel, depending on the machine.

There are two types of crash failures:

  1. Failing on glActiveTextureARB; the 915M apparently does not include that in its extension list.
  2. Failing on shader loading, namely glGenProgramsARB.

Trying to call these functions on the 915M in XP or the Q35 on Windows 7 causes either a freeze or a crash, depending on what the machine is feeling like at the moment.

There's another, worse failure: the display problem from that previous question isn't fixed, i.e., the rendered output looks like it's showing 8-bit data when it should be showing 16-bit. That's happening on the Q35 chipset with the latest Dell drivers, running XP.

Any thoughts on this? I'm considering going back to glDrawPixels, because even though that's slow, it works.
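
For reference, a minimal sketch of that fallback path, assuming the same 16-bit luminance buffer (mSize, theImage) used in the texture code below:

    // Hypothetical glDrawPixels fallback: draw the raw 16-bit luminance
    // buffer directly, bypassing textures and extensions entirely.
    Gl.glRasterPos2i(0, 0);
    Gl.glDrawPixels(mSize, mSize, Gl.GL_LUMINANCE, Gl.GL_UNSIGNED_SHORT, theImage);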

EDIT: more code! The following code fails with a GL_INVALID_VALUE error on both a Windows XP and a Windows 7 machine with the Q35 chipset:

    const int mSize = 256;
    ushort[] theImage = new ushort[mSize * mSize];
...
        // data setup: fill the image with an increasing ramp of 16-bit values
        int x, y, i = 0;
        for (x = 0; x < mSize; x++) {
            for (y = 0; y < mSize; y++) {
                theImage[(y * mSize) + x] =  (ushort)i++;
            }
        }
...
        // OnPaint: upload the texture and draw one textured quad
        int theTexName;
        Gl.glGenTextures(1, out theTexName);
        Gl.glBindTexture(Gl.GL_TEXTURE_2D, theTexName);
        Gl.glTexImage2D(Gl.GL_TEXTURE_2D, 0, Gl.GL_LUMINANCE16, mSize, mSize,
            0, Gl.GL_LUMINANCE, Gl.GL_UNSIGNED_SHORT, theImage);
        Gl.glTexParameteri(Gl.GL_TEXTURE_2D, Gl.GL_TEXTURE_MIN_FILTER, Gl.GL_LINEAR);
        Gl.glTexParameteri(Gl.GL_TEXTURE_2D, Gl.GL_TEXTURE_MAG_FILTER, Gl.GL_LINEAR);
        //theMagBuffer = null;

        Gl.glDisable(Gl.GL_BLEND);

        Gl.glEnable(Gl.GL_TEXTURE_2D);
        Gl.glPolygonMode(Gl.GL_FRONT_AND_BACK, Gl.GL_FILL);
        Gl.glColor4d(1.0, 1.0, 1.0, 1.0);
        Gl.glPushMatrix();

        Gl.glBindTexture(Gl.GL_TEXTURE_2D, theTexName);
        Gl.glBegin(Gl.GL_QUADS);
        Gl.glTexCoord2f(0, 1);
        Gl.glVertex2f(0, mSize);
        Gl.glTexCoord2f(0, 0);
        Gl.glVertex2f(0, 0);
        Gl.glTexCoord2f(1, 0);
        Gl.glVertex2f(mSize, 0);
        Gl.glTexCoord2f(1, 1);
        Gl.glVertex2f(mSize, mSize);
        Gl.glEnd();

        Gl.glDisable(Gl.GL_TEXTURE_2D);
        Gl.glDisable(Gl.GL_FRAGMENT_PROGRAM_ARB);

        Gl.glPopMatrix();
        GetGLError("MAG");
        Gl.glFinish();
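
For what it's worth, a minimal probe like this after each suspect call is how I've been trying to isolate which one actually raises the error (the message text is just for illustration):

    // Error probe (sketch): call immediately after a suspect GL call,
    // e.g. right after glTexImage2D.
    int err = Gl.glGetError();
    if (err != Gl.GL_NO_ERROR)
        Console.WriteLine("GL error after glTexImage2D: 0x" + err.ToString("X4"));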

Cheers,

Sadly, you cannot do anything about the crashes; it seems the extensions in question are simply not supported on those chipsets. Your app is not compatible with them, end of story. Just to make sure, use glGetString(GL_EXTENSIONS) to see which extensions are there. You can even do that in your application on startup and display a nice message box telling the user that some extensions required to run are not supported by the hardware, instead of just crashing.
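
A minimal sketch of that startup check with the Tao bindings (it must run with a current GL context; the required-extension list here is just an illustration covering glActiveTextureARB and glGenProgramsARB):

    // GL_ARB_multitexture provides glActiveTextureARB;
    // GL_ARB_fragment_program provides glGenProgramsARB.
    string extensions = Gl.glGetString(Gl.GL_EXTENSIONS);
    string[] required = { "GL_ARB_multitexture", "GL_ARB_fragment_program" };
    foreach (string ext in required)
    {
        if (extensions == null || !extensions.Contains(ext))
        {
            System.Windows.Forms.MessageBox.Show(
                "A required OpenGL extension is not supported by your hardware: " + ext);
            return; // bail out instead of crashing later
        }
    }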

As for the 8-bit / 16-bit question, using LUMINANCE16 doesn't actually guarantee that a 16-bit texture will be used; many drivers "emulate" this format with 8-bit textures without telling you. This is done to improve compatibility, as annoying as it is.
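
One way to confirm what the driver actually allocated is to query the texture right after uploading it; a sketch:

    // Ask the driver how many luminance bits the bound texture really got.
    // If this prints 8, the LUMINANCE16 request was silently demoted.
    int[] lumBits = new int[1];
    Gl.glGetTexLevelParameteriv(Gl.GL_TEXTURE_2D, 0,
        Gl.GL_TEXTURE_LUMINANCE_SIZE, lumBits);
    Console.WriteLine("Actual luminance bits: " + lumBits[0]);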

If you want true 16 bits, use 8-bit LUMINANCE_ALPHA, which should be supported pretty much anywhere, and pack the two 8-bit components back into a single number inside the shader.
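
A sketch of the packing side, reusing the theImage buffer from the question (the upload mirrors the original glTexImage2D call):

    // Split each 16-bit sample into two bytes:
    // high byte -> luminance channel, low byte -> alpha channel.
    byte[] packed = new byte[mSize * mSize * 2];
    for (int p = 0; p < theImage.Length; p++)
    {
        packed[2 * p] = (byte)(theImage[p] >> 8);       // high 8 bits
        packed[2 * p + 1] = (byte)(theImage[p] & 0xFF); // low 8 bits
    }
    Gl.glTexImage2D(Gl.GL_TEXTURE_2D, 0, Gl.GL_LUMINANCE8_ALPHA8, mSize, mSize,
        0, Gl.GL_LUMINANCE_ALPHA, Gl.GL_UNSIGNED_BYTE, packed);

In the fragment shader, the value can then be reconstructed as something like (lum * 65280.0 + alpha * 255.0) / 65535.0, i.e. high byte times 256 plus low byte, renormalized to [0, 1].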
