
Getting Memory Exception Error when calling C++ method from C#

I have a C++ DLL with the method signature below. I would like to know what the compatible C# declaration should be for calling this DLL. On execution I get the error "Attempted to read or write protected memory". The DLL in question is a third-party DLL, and I am able to call its simple methods that do not take pointers.

C++ Declaration:

int __stdcall Deduct(int V, unsigned char *AI);

C# Declaration:

 [System.Runtime.InteropServices.DllImport("rwl.dll",EntryPoint = "_Deduct@8", CallingConvention = CallingConvention.Cdecl)]   
 public static extern long Deduct(long i,ref string AI);

As per the documentation of the third-party DLL:

AI: Used as input and output buffer. The buffer shall have at least 80 bytes.
Input: Additional Information for the transaction. It is a byte pointer containing 7 bytes.

e.g., assume the usage information is not used:

if receipt number = 1234567,
hex value = 0x12D687,
7 bytes AI = D6 87 00 00 00 D6 87
Output: On return, the AI contains the UD.
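
To make the documented layout concrete, here is a small C# sketch (my own illustration, not from the vendor docs) that builds the 80-byte buffer and fills its first 7 bytes with the example value above:

using System;

class AiBufferExample
{
    static void Main()
    {
        // 80-byte in/out buffer, as required by the documentation.
        byte[] ai = new byte[80];

        // 7-byte example value copied verbatim from the vendor docs
        // (receipt number 1234567 = 0x12D687).
        byte[] example = { 0xD6, 0x87, 0x00, 0x00, 0x00, 0xD6, 0x87 };
        Array.Copy(example, ai, example.Length);

        // Prints the first 7 bytes: D6 87 00 00 00 D6 87
        Console.WriteLine(BitConverter.ToString(ai, 0, 7).Replace("-", " "));
    }
}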

Please help.

I think the C# signature looks wrong.

 [System.Runtime.InteropServices.DllImport("rwl.dll",EntryPoint = "_Deduct@8", CallingConvention = CallingConvention.Cdecl)]   
 public static extern long Deduct(int i,ref char[] AI);

If it is expecting a char[80] as its input, you may need to declare it as such and null-terminate it yourself, as whatever is reading from it might otherwise read past the end of the array.

Usage:

char[] tmp = new char[80];
tmp[0] = (char)0x00;
int i = 0;
Deduct(i, ref tmp);

I'm not sure if this will work but hopefully it helps.
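
One more thought: the native parameter is declared as unsigned char *, i.e. a raw byte buffer rather than a text string, so byte[] is another mapping commonly used for this kind of in/out buffer. A minimal, untested sketch (DLL name and entry point taken from the question; note the C++ header says __stdcall, not cdecl):

using System.Runtime.InteropServices;

static class NativeMethods
{
    // unsigned char* in/out buffer marshalled as a byte array.
    // [In, Out] lets the native code read the 7 input bytes and write its result back.
    [DllImport("rwl.dll", EntryPoint = "_Deduct@8",
               CallingConvention = CallingConvention.StdCall)]
    public static extern int Deduct(int v, [In, Out] byte[] ai);
}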

Besides string marshaling, I found a likely reason for getting an exception.

You have specified the return type and the first parameter as int in C, but as long in C#. This could cause problems.

The C language only specifies the minimum size of its integer types, which means the actual size differs across environments.

  • int in C : signed integer, at least 16 bits
  • long in C : signed integer, at least 32 bits
  • int in C# : 32-bit signed integer, always
  • long in C# : 64-bit signed integer, always

Most major 32/64-bit OSes and compilers use 32 bits for int in C. However, long is always 64 bits in C#, so the signatures do not match. I would suggest revising to:

public static extern int Deduct(int i, ref string AI); // Still not sure about string
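
Putting both suggestions together (an int return type and first parameter, combined with the byte[] buffer sketched earlier and the __stdcall convention from the C++ declaration), a rough, untested sketch of the call could look like this; the buffer contents are just the example values from the vendor documentation, and 0 is a placeholder for V:

using System;
using System.Runtime.InteropServices;

static class Rwl
{
    // Sketch only: DLL name, entry point and __stdcall come from the question.
    [DllImport("rwl.dll", EntryPoint = "_Deduct@8",
               CallingConvention = CallingConvention.StdCall)]
    public static extern int Deduct(int v, [In, Out] byte[] ai);
}

class Program
{
    static void Main()
    {
        byte[] ai = new byte[80];   // documentation: buffer of at least 80 bytes
        // First 7 bytes = example input from the docs; the rest stay zero.
        new byte[] { 0xD6, 0x87, 0x00, 0x00, 0x00, 0xD6, 0x87 }.CopyTo(ai, 0);

        int result = Rwl.Deduct(0, ai);   // 0 is a placeholder value for V
        // On return, ai should contain the output ("UD") described in the documentation.
        Console.WriteLine(result);
        Console.WriteLine(BitConverter.ToString(ai, 0, 7));
    }
}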
