
C++ DLL returns different results when called from Excel VBA/Python

Suppose my VC100 DLL project has the following:

Foo.h

#include <vector>

struct Foo {
    std::vector<double> v;
    ...

    Foo(const double* x, const long& n, ...)
        : v(x, x + n),
          ... {}

    double bar() {...} // Does something to this->v, returns a double
};

myproject.cpp

#include "Foo.h"    

double fun(double* x, long n, ...) {

    Foo f(x, n, ...);

    const double r(f.bar());

    std::memcpy(x, f.v.data(), n * sizeof(double));

    return r;
}

inline double __stdcall xfun(double* x, long n, ...) {
    return fun(x, n, ...);
}

def.def

LIBRARY "myproject"
EXPORTS
fun
xfun
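
One thing worth double-checking here (my suggestion, not part of the original post): if fun and xfun are compiled with C++ linkage, the compiler mangles their names, and the plain names listed in def.def, as well as the name Python looks up via ctypes/GetProcAddress, may not match the symbols the linker emits. A common pattern is to declare the exports extern "C"; a sketch under that assumption:

// Hypothetical declarations for Foo.h or myproject.cpp: extern "C"
// suppresses C++ name mangling, so that "fun" and "xfun" in the .def
// file (and in ctypes/GetProcAddress lookups from Python) match the
// exported symbols.
extern "C" {
    double fun(double* x, long n /*, ... */);
    double __stdcall xfun(double* x, long n /*, ... */);
}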

mywb.xlsm (VBA)

' C's "long" is 32 bits, so n must be declared As Long (VBA's Integer is 16 bits)
Declare Function xl_fun Lib "myproject.dll" Alias "xfun" _
    (ByRef x As Double, ByVal n As Long, ...) As Double

xl_fun() is then called from Excel, and fun() is called from Python 2.7 afterwards.

My question is this: xl_fun() returns with its first argument updated to the contents of Foo.v, and fun() called from Python does the same.

However, the Python 2.7 call produces a different updated first argument and a different return value than the Excel call.

Without going into too much detail, the Python update and return value are better numbers than the Excel ones.

All exceptions are handled by the time f.bar() returns.

I've stepped through and confirmed the inputs passed to Foo's constructor are identical in both cases. I've also confirmed Foo.v's initial state is identical in both cases. How can the same DLL produce different results given the same inputs?

Any help is much appreciated.

Excel sets the x87 FPU control word for extended precision (80-bit), while most other run-time environments don't. This may play a role in the differences you observe.
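
If that is the cause, one way to test it is to pin the x87 precision control to 53-bit on entry to the exported function and restore it on exit. A minimal sketch, assuming a 32-bit MSVC build (the _controlfp_s CRT function and the _PC_53/_MCW_PC flags are real, but wiring them into xfun is my illustration, not the original code):

#include <float.h>   // _controlfp_s, _PC_53, _MCW_PC (MSVC CRT)

double __stdcall xfun(double* x, long n /*, ... */) {
    // Read the caller's (Excel's) current FPU control word.
    unsigned int saved = 0;
    _controlfp_s(&saved, 0, 0);

    // Force x87 precision control to 53-bit IEEE double, the setting
    // most non-Excel runtimes (including Python) run under.
    unsigned int tmp = 0;
    _controlfp_s(&tmp, _PC_53, _MCW_PC);

    const double r = fun(x, n /*, ... */);

    // Restore Excel's precision bits before returning.
    _controlfp_s(&tmp, saved & _MCW_PC, _MCW_PC);
    return r;
}

If the Excel results then match the Python ones, the extra 80-bit intermediate precision was the source of the discrepancy.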
