

Using autograd on a Python-wrapped C function with a custom datatype

Can autograd in principle work on Python-wrapped C functions?

The C function I'd like to differentiate expects arguments with a REAL8 data type, and I can successfully call it in Python by giving it either float or np.float64 arguments.

Upon closer inspection with my debugger, I found that I get a TypeError when autograd changes the float or np.float64 arguments to ArrayBox objects while attempting to evaluate the gradient.

Is there a way to allow autograd to proceed in this scenario?

Is there another strategy that might be better for my situation?

Are there any primitives I can code up to fix this?

Background:

Bilby is a newer Python library that wraps an older code (LALSIMULATION) written in C. It provides extensive pre-coded gravitational wave models, and I would like to use these pre-coded models in my research. My first task is to figure out how to calculate accurate Hessians and gradients of these models. I'd like to use automatic differentiation to solve this problem because of its well-known numerical accuracy, but I'm stuck.

import autograd.numpy as np
from autograd import grad
import BILBY_TAYLORF2 # My own class that wraps Bilby

model = BILBY_TAYLORF2()

gradient_likelihood = grad(model.logLikelihood)

gradient_likelihood(np.array([10., 10.]))
TypeError: in method 'SimInspiralChooseFDWaveform', argument 3 of type 'REAL8'

For reference, SimInspiralChooseFDWaveform is the first call into the C code.

I'm afraid this answer is probably far too late for the original poster, but in case others come across it, here's some information.

Autograd relies on functions being constructed from other functions for which analytical derivatives have been defined. In autograd's case it knows the derivatives of the standard operators, and it also redefines many numpy functions to include their analytical derivatives (or vector-Jacobian products). It keeps track of all the functions/operations that have been put together and performs automatic differentiation by repeated application of the chain rule.
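As a minimal illustration (this snippet is not from the original post; f is just a made-up function), autograd can differentiate anything built purely from operations it already knows:

import autograd.numpy as np
from autograd import grad

# f is composed only of autograd.numpy primitives (sin, exp, power, multiply),
# so autograd can trace it and chain the known derivatives together
def f(x):
    return np.sin(x) * np.exp(-x ** 2)

df = grad(f)
print(df(0.5))  # derivative of f at x = 0.5, obtained via the chain rule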

The SWIG-wrapped lalsimulation functions just return a waveform. They do not have a gradient function defined (or a gradient for the dot product of the waveform with the data and with itself, as would be required for the likelihood calculation in bilby), so autograd doesn't know what to do with them. The autograd tutorial shows how you can define a primitive function, define its vector-Jacobian product, and tell autograd to link the two with the defvjp function. So, you would have to create your own function that wraps the likelihood function and provides the gradient. As you are treating the lalsimulation/likelihood functions as black boxes, you don't know an analytical form of the gradient, so it would have to use a simple method like finite differences to get the gradient (maybe with some iteration). An example of this sort of thing, i.e. defining a gradient for a black-box function, is given in this example for PyMC3, which could probably be adapted for autograd.
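Here is a rough sketch of that approach using autograd's primitive/defvjp API. It is illustrative only: black_box_loglike is a hypothetical stand-in for the bilby/lalsimulation likelihood (in the question it would be model.logLikelihood), and the central-difference step eps is an arbitrary choice.

import autograd.numpy as np
from autograd import grad
from autograd.extend import primitive, defvjp

# Hypothetical stand-in for the opaque likelihood; in the real problem this is
# the function that eventually calls SimInspiralChooseFDWaveform with plain floats.
def black_box_loglike(params):
    return -0.5 * np.sum((params - 3.0) ** 2)

# Declaring the wrapper as a primitive tells autograd to treat it as a single
# node and not trace inside it, so no ArrayBox ever reaches the C code.
@primitive
def loglike(params):
    return black_box_loglike(np.asarray(params, dtype=np.float64))

def loglike_vjp(ans, params):
    # Central finite differences, since no analytical gradient of the black box is known.
    eps = 1e-7
    params = np.asarray(params, dtype=np.float64)
    grads = np.zeros_like(params)
    for i in range(params.size):
        step = np.zeros_like(params)
        step[i] = eps
        grads[i] = (black_box_loglike(params + step)
                    - black_box_loglike(params - step)) / (2.0 * eps)
    return lambda g: g * grads

# Link the primitive to its vector-Jacobian product so grad() can use it.
defvjp(loglike, loglike_vjp)

gradient_likelihood = grad(loglike)
print(gradient_likelihood(np.array([10., 10.])))

Wrapping at the level of the whole log-likelihood keeps the bookkeeping simple, but each gradient evaluation then costs two likelihood calls per parameter; wrapping the individual lalsimulation calls instead would in principle let autograd handle the rest of the likelihood analytically.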
