So I have a function that takes four numerical arguments and produces a single numerical result:

f(w,x,y,z) --> A

Given the function f and a target result A, is there an iterative method for discovering parameters w, x, y, z that produce the value A?
If it helps, my function f is a quintic Bezier where most of the parameters are determined; I have isolated just the four that are required to fit the value A:

Q(t) = R(1−t)^5 + 5S(1−t)^4*t + 10T(1−t)^3*t^2 + 10U(1−t)^2*t^3 + 5V(1−t)*t^4 + W*t^5

R, S, T, U, V, W are vectors where R and W are known; I have isolated a single element in each of S, T, U, V that varies as a parameter.
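For concreteness, the polynomial above can be evaluated like this (a minimal sketch; the control values passed in are hypothetical scalars standing in for the isolated components of R, S, T, U, V, W):

```python
from math import comb

def quintic_bezier(t, controls):
    """Evaluate Q(t) = sum over i of C(5,i) * (1-t)^(5-i) * t^i * P_i,
    where controls = [P_0, ..., P_5] are the six scalar control values
    (one component each of R, S, T, U, V, W)."""
    return sum(comb(5, i) * (1 - t) ** (5 - i) * t ** i * p
               for i, p in enumerate(controls))

# The curve interpolates the endpoints: Q(0) = R, Q(1) = W.
print(quintic_bezier(0.0, [1, 2, 3, 4, 5, 6]))  # 1.0
print(quintic_bezier(1.0, [1, 2, 3, 4, 5, 6]))  # 6.0
```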
If you can impose 3 (or more) additional equations that you know (or suspect) must hold for your 4-variable solution giving the target value A, then you can try applying Newton's method for solving a system of k equations in k unknowns. Otherwise, without a deeper understanding of the structure of the function you are trying to make equal to A, the only general technique I'm aware of that's easy to implement is to define the error function g(w,x,y,z) = |f(w,x,y,z) - A| and search for a minimum of g. Typically the "minimum" found will be a local minimum, so it may take many restarts of the minimization with different starting values for your parameters to actually find a local minimum with g = 0, which is what you want. This is very easy to implement and try in a few lines, e.g. in MATLAB using fminsearch.
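Outside MATLAB, the same restart-and-minimize idea can be sketched with a simple derivative-free compass search in pure Python (the linear f and target A = 10 below are hypothetical stand-ins for your function and target):

```python
import random

def g(params, f, A):
    """Error function to minimize: g = |f(X) - A|."""
    return abs(f(*params) - A)

def compass_search(f, A, x0, step=1.0, tol=1e-9, max_iter=10000):
    """Derivative-free local minimization of g: try +/- step along each
    coordinate; halve the step whenever no move improves the error."""
    x = list(x0)
    best = g(x, f, A)
    for _ in range(max_iter):
        improved = False
        for i in range(len(x)):
            for d in (step, -step):
                cand = x[:]
                cand[i] += d
                val = g(cand, f, A)
                if val < best:
                    x, best, improved = cand, val, True
        if not improved:
            step *= 0.5
            if step < tol:
                break
    return x, best

# Hypothetical example: f(w,x,y,z) = w + 2x + 3y + 4z, target A = 10,
# with a handful of random restarts as suggested above.
f = lambda w, x, y, z: w + 2 * x + 3 * y + 4 * z
random.seed(0)
best_x, best_g = min(
    (compass_search(f, 10.0, [random.uniform(-5, 5) for _ in range(4)])
     for _ in range(5)),
    key=lambda r: r[1])
print(best_g)  # essentially 0: many (w,x,y,z) satisfy f = 10
```

Each restart typically lands on a different solution, which illustrates the non-uniqueness discussed in the next answer.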
The set of solutions of the equation f(w,x,y,z) = A (where w, x, y, z and A are all scalars) is, in general, a 3-dimensional manifold (surface) in the 4-dimensional space R^4 of points (w,x,y,z). I.e., the solution is massively non-unique.
Now, if f is simple enough for you to compute its derivative, you can use Newton's method to find a root: the gradient is the direction of the fastest change of the function, so you step along it.

Specifically, let X_0 = (w_0, x_0, y_0, z_0) be your initial approximation of a solution and let G = f'(X_0) be the gradient at X_0. Then f(X_0 + h) = f(X_0) + (G, h) + O(|h|^2), where (a, b) is the dot product. Let h = a*G, and solve A = f(X_0) + a*|G|^2 to get a = (A - f(X_0))/|G|^2 (if G = 0, change X_0), and set X_1 = X_0 + a*G. If f(X_1) is close enough to A, you are done; otherwise proceed to compute f'(X_1), etc.
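The iteration above can be sketched as follows, with a central-difference estimate standing in for the analytic gradient f' (the quadratic f and target A = 9 are hypothetical examples, not the question's Bezier):

```python
def num_grad(f, x, eps=1e-6):
    """Central-difference estimate of the gradient of f at x."""
    grad = []
    for i in range(len(x)):
        xp, xm = x[:], x[:]
        xp[i] += eps
        xm[i] -= eps
        grad.append((f(*xp) - f(*xm)) / (2 * eps))
    return grad

def solve_for_target(f, A, x0, tol=1e-10, max_iter=100):
    """Iterate X <- X + a*G with a = (A - f(X)) / |G|^2,
    stopping once f(X) is within tol of the target A."""
    x = list(x0)
    for _ in range(max_iter):
        fx = f(*x)
        if abs(fx - A) < tol:
            break
        G = num_grad(f, x)
        g2 = sum(gi * gi for gi in G)
        if g2 == 0:  # gradient vanished: change X_0 as advised above
            raise ValueError("zero gradient; pick a different X_0")
        a = (A - fx) / g2
        x = [xi + a * gi for xi, gi in zip(x, G)]
    return x

# Hypothetical example: drive w^2 + x^2 + y^2 + z^2 to the value 9.
f = lambda w, x, y, z: w**2 + x**2 + y**2 + z**2
sol = solve_for_target(f, 9.0, [1.0, 1.0, 1.0, 1.0])
print(abs(f(*sol) - 9.0))  # ~0: one point on the 3-manifold of solutions
```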
If you cannot compute f', you can play with many other methods.