
C# app calling 64-bit C/C++ DLL crashes, but 32-bit version runs

My C/C++ DLL is built for both 32-bit and 64-bit Windows. Two programmers working in .NET (C# and VB.NET) report that when they build their clients for 32-bit, everything runs correctly, but when they build for 64-bit, the run ends in an access violation.

I am not a .NET programmer, but I can load their process and step through the DLL code in a C++ debugger. The .NET run definitely hits memory corruption that does not occur when the client is one I wrote in C++/VCL.

My question: Is there something special in the .NET project settings for interfacing to a 64-bit C/C++ DLL that differs from what works under 32-bit?

Possibly relevant, but I can't prove it: the DLL is built with Character Set = Not Set, for UTF-8/ASCII strings. I don't know whether the VB.NET client is built for Unicode, but I assume that would create problems.
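If it matters, my understanding is that the string width at the P/Invoke boundary is decided by the managed declaration (CharSet.Ansi, or [MarshalAs(UnmanagedType.LPStr)] per parameter), not by the client project's character settings, and that this works the same in 32- and 64-bit builds. A minimal sketch of that idea, using the real Win32 export lstrlenA only as a stand-in for any function that takes a char*:

using System;
using System.Runtime.InteropServices;

internal static class AnsiMarshalDemo
{
    // With CharSet.Ansi (or [MarshalAs(UnmanagedType.LPStr)] on the parameter) the CLR
    // hands the native side a narrow, null-terminated copy of the managed string,
    // regardless of how the client project is otherwise configured.
    [DllImport("kernel32.dll", CharSet = CharSet.Ansi, ExactSpelling = true)]
    private static extern int lstrlenA(string s);

    internal static void Demo()
    {
        // Prints 5 in both a 32-bit and a 64-bit process: the DLL sees "hello" as char*.
        Console.WriteLine(lstrlenA("hello"));
    }
}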

EDITED TO ADD the declarations from the VB.NET and C# clients and from the C++ DLL.

This is the VB.NET declaration:

Declare Function Edit_RunEdits Lib "EDITS50.DLL" (ByVal smfID As Integer, _
                                                  ByVal edit_set_tag As String, _
                                                  ByVal layout_tag As String, _
                                                  ByVal data As String, _
                                                  ByVal edit_options As Integer, _
                                                  ByRef errors_count As Integer, _
                                                  ByVal owner As IntPtr, _
                                                  ByVal callback_func As IntPtr) As Integer

This is the C# declaration:

[DllImportAttribute("EDITS50.dll", EntryPoint = "Edit_RunEdits")]
public static extern int Edit_RunEdits(int smfID,
     [InAttribute()] [MarshalAsAttribute(UnmanagedType.LPStr)] string edit_set_tag,
     [InAttribute()] [MarshalAsAttribute(UnmanagedType.LPStr)] string layout_tag,
     [InAttribute()] [MarshalAsAttribute(UnmanagedType.LPStr)] string data,
     int edit_options,
     ref int errors_count,
     System.IntPtr owner,
     System.IntPtr callback_func);
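For reference, this is how I understand the arguments to line up against the native signature shown below. It is only a call-site sketch: the tag strings and values are placeholders, not real EDITS50 inputs, and it assumes the Edit_RunEdits declaration above is in scope.

int errorCount = 0;

int rc = Edit_RunEdits(
    1,               // smfID         -> const int   (4 bytes in both 32- and 64-bit)
    "EDIT_SET_TAG",  // edit_set_tag  -> const char* (marshaled as a narrow ANSI copy)
    "LAYOUT_TAG",    // layout_tag    -> const char*
    "record data",   // data          -> const char*
    0,               // edit_options  -> const int
    ref errorCount,  // errors_count  -> int*        (ref int is passed by pointer)
    IntPtr.Zero,     // owner         -> void*       (IntPtr is 8 bytes in a 64-bit process)
    IntPtr.Zero);    // callback_func -> void*       (null here; see the callback note below)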

This is the C++ side:

#define EDIT_API  __declspec(dllexport) __stdcall
extern "C" int EDIT_API Edit_RunEdits( const int smfID, const char* edit_set_tag, 
const char* layout_tag, 
const char* data, 
const int edit_options, 
int* errors_count, 
void* owner, 
void* callback_func);
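One thing I have asked them to double-check is the callback parameter. The real callback signature is not shown here, so the delegate below is purely hypothetical, but the general pattern applies: if callback_func comes from a managed delegate, the delegate should declare the native calling convention and must be kept alive for as long as the DLL may call it; otherwise the native code ends up calling through a collected delegate and the failure looks exactly like an access violation.

using System;
using System.Runtime.InteropServices;

internal static class EditCallbackSketch
{
    // Hypothetical callback shape -- the real EDITS50 callback signature is not
    // shown in this post, so this delegate is illustrative only.
    [UnmanagedFunctionPointer(CallingConvention.StdCall)]
    internal delegate int EditCallback(IntPtr owner, int errorCode);

    // Keep a reference for as long as the DLL can invoke the callback; if the
    // delegate is garbage-collected, the stored function pointer dangles.
    private static EditCallback _keepAlive;

    internal static IntPtr MakeCallbackPointer()
    {
        _keepAlive = (owner, errorCode) =>
        {
            Console.WriteLine("edit error " + errorCode);
            return 0;
        };
        return Marshal.GetFunctionPointerForDelegate(_keepAlive);
    }
}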

I believe I may have found the problem. The project was built for AnyCPU, but under the x64 settings Prefer32Bit was set to "true". I am asking the .NET programmers to try building explicitly for x64, and to look for and override every instance of Prefer32Bit (setting them to "false").
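In case it helps anyone who hits the same thing: a trivial runtime check makes the actual bitness of the client process obvious before the first P/Invoke call (plain .NET, nothing specific to EDITS50):

using System;

internal static class BitnessCheck
{
    // If Prefer32Bit silently turned an AnyCPU build into a 32-bit process,
    // this reports Is64BitProcess = False and an IntPtr size of 4.
    internal static void Report()
    {
        Console.WriteLine("64-bit process: " + Environment.Is64BitProcess
                          + ", IntPtr size: " + IntPtr.Size);
    }
}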

Thank you, everyone, for your feedback.
