
static_cast from 'const unsigned char *const *' to 'const char *const *' is not allowed

What I'm running into is my compiler refusing to cast my unsigned char pointer to a signed char pointer. I was confused for a little while, because I had been using static_cast to convert signedness for the longest time.

Then I did a little digging (well, it wasn't very deep; I did a little scooping!), and even though I now understand that static_cast refusing these pointer conversions is precisely what makes it the safer, better way to cast (compared with the traditional alternatives, which may invoke implementation-defined or undefined behavior), I'm still not sure what I should actually do in my situation.

What I have here is a call to an OpenGL API function whose signature is

void glShaderSource(
    GLuint shader, GLsizei count, const GLchar **string, const GLint *length
);

I recently changed the file reader API so that instead of returning the data read from a file as a char *, it now returns an unsigned char *. This change was deliberate: I feel that unsigned char is a much better handle for raw data (even though it may well be ASCII data), and indeed void * might be even clearer in this regard.

And then of course I'll be passing the address of this pointer in as the third argument to glShaderSource.
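
Roughly what the call site looks like (the loader header and the names here are simplified stand-ins for my actual code):

#include <glad/glad.h>  // or whichever loader header declares glShaderSource in your build

// Sketch only: shader, source and length stand in for my real variables.
void uploadShader(GLuint shader, const unsigned char *source, GLint length)
{
    // error: static_cast from 'const unsigned char *const *' to
    //        'const char *const *' is not allowed
    glShaderSource(shader, 1, static_cast<const GLchar *const *>(&source), &length);
}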

I reckon that it's safe for me to just do the C-style cast to GLchar** , and in fact that is probably the standard answer for this situation. Using reinterpret_cast would just be going above and beyond, but admittedly by only a small amount.

But I'd like to find out a bit more about what the thought process is supposed to be in this situation. Why exactly am I able to dismiss the signedness of the chars at stake here? Is it simply that, since I don't ever expect to write a shader with the high bit set on any of its characters, the cast is fine?

What if I were facing a situation with signed/unsigned integers, where there really would be dire consequences if spurious negative values were interpreted as large positive ones? How could I write code here that tries to be "safe" about it?
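
To make that concrete, here is the kind of failure mode I mean (a contrived sketch, unrelated to the shader code itself):

#include <cstdio>

int main()
{
    int balance = -40;  // a legitimately negative value

    // Aliasing the same bytes as unsigned: on a two's-complement machine this
    // prints 4294967256 -- the negative value silently shows up as a huge
    // positive one, and no cast ever "checks" it.
    std::printf("%u\n", *reinterpret_cast<unsigned int *>(&balance));
}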

My instinct tells me that this is obviously impossible without writing code that actually inspects the data itself rather than just passing the pointer along. So is there simply no way to regain the safety of static_cast in this situation, given that I'm forced to work with pointers?

You need reinterpret_cast even to convert between char * and unsigned char * (with or without const). This is because you are treating the bits stored as one type as the bits of a different type, whereas static_cast is for doing value conversions.

As WhozCraig points out, converting between char ** and unsigned char ** is actually aliasing one pointer as another pointer type (so it also requires reinterpret_cast).

This could all be a problem in theory, but in practical terms (IMO) the effort required to support every possibility is just way too much hassle; for all intents and purposes you can assume that aliasing char as unsigned char gives the same result as a value conversion, and similarly for the two pointer types.
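
So in practice, for the call in the question, something along these lines should be fine (a sketch with illustrative names; cast to whichever pointer type your GL header actually declares for the third parameter, which the error message shows as const char *const * here):

#include <glad/glad.h>  // or whichever loader header declares glShaderSource in your build

// data and length come from the asker's file reader; shader is the GL shader object.
void uploadShader(GLuint shader, const unsigned char *data, GLint length)
{
    glShaderSource(shader, 1,
                   reinterpret_cast<const GLchar *const *>(&data),
                   &length);
}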

You can make the conversion entirely with static_cast if you use a void* intermediate:

unsigned char* src = ...;  // your input
char* srcChar = static_cast<char*>(static_cast<void*>(src));
glShaderSource(..., &srcChar, ...);  // pass the address of the char* alias, not of src

I wouldn't say it's nicer than a reinterpret_cast, but at least it shows that a reinterpret_cast is not strictly necessary.
