I would like to compute the gradient and Hessian of the following function with respect to the variables x and y. Could anyone help? Thanks a lot.
I found code on GitHub relevant to the calculation of the Rosenbrock function:
import numpy as np

# Rosenbrock-type function: f(x, y) = 10*(y - x^2)^2 + (1 - x)^2
def objfun(x, y):
    return 10*(y - x**2)**2 + (1 - x)**2

# Analytic gradient [df/dx, df/dy]
def gradient(x, y):
    return np.array([-40*x*y + 40*x**3 - 2 + 2*x, 20*(y - x**2)])

# Analytic Hessian (symmetric 2x2 matrix)
def hessian(x, y):
    return np.array([[120*x*x - 40*y + 2, -40*x], [-40*x, 20]])
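As a quick sanity check (not part of the original snippet), the analytic gradient can be compared against a central finite-difference approximation at an arbitrary test point:

```python
import numpy as np

def objfun(x, y):
    return 10*(y - x**2)**2 + (1 - x)**2

def gradient(x, y):
    return np.array([-40*x*y + 40*x**3 - 2 + 2*x, 20*(y - x**2)])

# Central-difference approximation of the gradient at (x0, y0)
x0, y0, h = 0.5, -0.3, 1e-6
num_grad = np.array([
    (objfun(x0 + h, y0) - objfun(x0 - h, y0)) / (2*h),
    (objfun(x0, y0 + h) - objfun(x0, y0 - h)) / (2*h),
])
print(np.allclose(gradient(x0, y0), num_grad, atol=1e-4))  # True if the formulas match
```

The same trick (differencing the gradient) can be used to spot-check the Hessian entries.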
Update:
import numpy as np
from sympy import symbols, hessian, N

x, y = symbols('x y')
f = (1/2)*x**2 + 5*y**2 + (2/3)*(x - 2)**4 + 8*(y + 1)**4
H = hessian(f, [x, y]).subs([(x, 1), (y, 1)])
print(np.array(H))
print(N(H.condition_number()))
Output:
[[9.00000000000000 0]
[0 394]]
43.7777777777778
See also the SymPy documentation on fields: https://docs.sympy.org/dev/modules/vector/fields.html
There is the hessian function for expressions and the jacobian method for matrices.
Here are the function and variables of your problem:
>>> from sympy.abc import x, y
>>> from sympy import ordered, Matrix, hessian
>>> eq = x**2/2 + 5*y**2 + 2*(x - 2)**4/3 + 8*(y + 1)**4
>>> v = list(ordered(eq.free_symbols)); v
[x, y]
We can write our own helper for the gradient, which creates a matrix and uses the jacobian method on it:
>>> gradient = lambda f, v: Matrix([f]).jacobian(v)
Then the quantities can be calculated as:
>>> gradient(eq, v)
Matrix([[x + 8*(x - 2)**3/3, 10*y + 32*(y + 1)**3]])
>>> hessian(eq, v)
Matrix([
[8*(x - 2)**2 + 1, 0],
[ 0, 96*(y + 1)**2 + 10]])
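For numeric work, the symbolic results above can be turned into fast NumPy-backed functions with SymPy's lambdify; evaluating at (1, 1) reproduces the Hessian from the update. This is a minimal sketch combining the pieces shown above:

```python
import numpy as np
from sympy import Matrix, hessian, lambdify, ordered
from sympy.abc import x, y

eq = x**2/2 + 5*y**2 + 2*(x - 2)**4/3 + 8*(y + 1)**4
v = list(ordered(eq.free_symbols))  # [x, y]

grad = Matrix([eq]).jacobian(v)  # symbolic gradient (1x2 matrix)
H = hessian(eq, v)               # symbolic Hessian (2x2 matrix)

# lambdify compiles the symbolic matrices into numeric functions
grad_fn = lambdify(v, grad, 'numpy')
hess_fn = lambdify(v, H, 'numpy')

print(grad_fn(1, 1))  # gradient evaluated at (1, 1)
print(hess_fn(1, 1))  # Hessian evaluated at (1, 1)
```

This avoids calling subs repeatedly when the derivatives need to be evaluated at many points, e.g. inside an optimization loop.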