
Passing byte array from C# to C++ DLL as char*

I am passing a byte[] from C# to a C++ DLL.

Inside the C++ DLL, I need to call a function that accepts and reads an istream object. I intend to receive the byte[] from C# as a char* and convert it to an istream.

C++ DLL

extern "C" __declspec(dllexport) bool CheckData(char* data, int dataLength)

C#

[DllImport("example.dll", CallingConvention = CallingConvention.Cdecl)]
public static extern bool CheckData(byte[] incoming, int size);

public void Process(byte[] bytes)
{
    CheckData(bytes, bytes.Length);
}

Although it seems to work fine, I find that the equivalent data type of byte[] in C++ is unsigned char*. I thought of changing the parameter to unsigned char*, but most streams in C++ work on char*, not unsigned char*.

I would like to ask

1) Both data types char* and unsigned char* are 1 byte. What happens behind the scenes? Is there any potential problem if I keep using byte[] with char*?

2) In case there is any problem, how should I use unsigned char* to construct an istream object?

Actually, the char* and unsigned char* types are not 1 byte but 4 bytes, assuming we are talking about a Win32 application: those are pointers, and all pointers have the same size regardless of the size of the data being pointed at.

When the P/Invoke mechanism sees an array of "simple values" as a function argument, it happily feeds a pointer to the start of the array to the underlying C function. After all, all it really knows about the C function from the info in the DLL is where its code starts. As far as I know, the number and types of the arguments are not encoded in the symbol name, so it trusts the information you provided. This means that even if you had fed it an int array, the actual call to the C function would still have worked, since the sizes of the arguments pushed on the stack (a pointer and an int) match the ABI of the function. Of course, the processing would probably have been wrong, since the element size wouldn't have matched.

See also https://msdn.microsoft.com/en-us/library/75dwhxf7(v=vs.110).aspx for more details about what happens.

Processing is where the difference between unsigned char and char comes in: if on the C# side you do some math on the byte values (ranging 0 to 255), then pass them to the C side where char values (-128 to 127) are expected for some more math, something could go wrong. If the buffer is just used as a way to move data around, it's all fine.

