
OpenGL equivalent to GL_POINT_SIZE_ARRAY_OES?

I'm trying to draw point sprites in a small Mac app. I want each sprite to have its own size, and I know that OpenGL ES has the client state "GL_POINT_SIZE_ARRAY_OES".

I did some googling and discovered that there is a similar value "GL_POINT_SIZE_ARRAY_APPLE" which (you'd think) should do the same thing. For some reason, though, it doesn't seem to. Here's my drawing code:

glEnableClientState(GL_VERTEX_ARRAY);
glEnableClientState(GL_POINT_SIZE_ARRAY_APPLE);

glVertexPointer(2, GL_FLOAT, sizeof(SpriteData), spriteVertices);
glPointSizePointerAPPLE(GL_FLOAT, sizeof(SpriteData), spriteVertices + sizeof(LocationF));

glDrawArrays(GL_POINTS, 0, spriteCount);

glDisableClientState(GL_POINT_SIZE_ARRAY_APPLE);
glDisableClientState(GL_VERTEX_ARRAY);

SpriteData is a struct containing the vertex/size data of each sprite. spriteVertices is just an interleaved array of that struct.
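For concreteness, the layout being described is presumably something like the following sketch; the exact field names are my guess and not taken from the post:

typedef struct {
    float x, y;           /* what the post calls LocationF: a 2D position */
} LocationF;

typedef struct {
    LocationF position;   /* consumed by glVertexPointer */
    float     size;       /* the per-sprite point size */
} SpriteData;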

The vertex pointer is working fine; the sprites are drawn, but their individual size values seem to be ignored. Every point instead falls back to the single value set by glPointSize().

Despite the fact that this code compiles with no warnings, it seems very suspicious to me that googling "GL_POINT_SIZE_ARRAY_APPLE" brings up almost no results. Is this a useless parameter? If so, how else can I achieve what I want?

There is no official OpenGL extension that exposes GL_POINT_SIZE_ARRAY_APPLE. It may be some detritus in Apple's headers, but you shouldn't use it. Instead, pass the per-sprite size as a generic vertex attribute and write that value to gl_PointSize in your vertex shader.
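Here is a minimal sketch of that approach. It assumes a legacy GL 2.1 context (hence GLSL 1.20 and the built-in matrices) and an already-compiled, linked shader program; the attribute names, the SpriteData fields, and the drawSprites helper are illustrative, not from the original post:

/* Vertex shader: copy the per-sprite size attribute into gl_PointSize. */
const char *vertexShaderSource =
    "#version 120\n"
    "attribute vec2  a_position;\n"
    "attribute float a_pointSize;\n"
    "void main() {\n"
    "    gl_Position  = gl_ModelViewProjectionMatrix * vec4(a_position, 0.0, 1.0);\n"
    "    gl_PointSize = a_pointSize;\n"
    "}\n";

typedef struct {
    float x, y;      /* 2D position */
    float size;      /* per-sprite point size */
} SpriteData;

void drawSprites(GLuint program, const SpriteData *sprites, GLsizei spriteCount)
{
    GLint posLoc  = glGetAttribLocation(program, "a_position");
    GLint sizeLoc = glGetAttribLocation(program, "a_pointSize");

    glUseProgram(program);

    /* Let the vertex shader control the point size instead of glPointSize(). */
    glEnable(GL_VERTEX_PROGRAM_POINT_SIZE);

    glEnableVertexAttribArray((GLuint)posLoc);
    glEnableVertexAttribArray((GLuint)sizeLoc);

    /* Interleaved attributes: stride is the struct size, pointers address each field. */
    glVertexAttribPointer((GLuint)posLoc,  2, GL_FLOAT, GL_FALSE, sizeof(SpriteData),
                          &sprites[0].x);
    glVertexAttribPointer((GLuint)sizeLoc, 1, GL_FLOAT, GL_FALSE, sizeof(SpriteData),
                          &sprites[0].size);

    glDrawArrays(GL_POINTS, 0, spriteCount);

    glDisableVertexAttribArray((GLuint)sizeLoc);
    glDisableVertexAttribArray((GLuint)posLoc);
}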

If you want cross-platform code, you should avoid system-dependent headers. Instead, use a proper OpenGL loader, which comes with cross-platform headers that won't have system-dependent, non-standard detritus in them.
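As a minimal sketch of that suggestion, using GLEW as one possible loader (the choice of GLEW is mine; the answer doesn't name a specific library):

#include <stdio.h>
#include <GL/glew.h>   /* loader-provided header replaces the system <OpenGL/gl.h> */

/* Call once, after an OpenGL context has been created and made current. */
static int initGLLoader(void)
{
    if (glewInit() != GLEW_OK) {
        fprintf(stderr, "Failed to initialize GLEW\n");
        return 0;
    }
    return 1;
}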
