
Exponential curve fitting without the Curve Fitting toolbox?

I have some data points to which I need to fit an exponential curve of the form

y = B * exp(A/x)

(without the help of the Curve Fitting Toolbox).

What I have tried so far is to linearize the model by applying the logarithm, which results in

log(y/B) = A/x

log(y) = A/x + log(B)

I can then write it in the form

Y = AX + B

Now, if I neglect B, then I am able to solve it with

A = pinv(X) * Y

but I am stuck on the value of B ...

Fitting a curve of the form

y = b * exp(a / x)

to some data points (xi, yi) in the least-squares sense is difficult. You cannot use linear least squares for that, because the model parameters (a and b) do not appear in an affine manner in the equation. Unless you're ready to use a nonlinear least-squares method, an alternative approach is to modify the optimization problem so that the modified problem can be solved using linear least squares (this process is sometimes called "data linearization"). Let's do that.

Under the assumption that b and the yi's are positive, you can apply the natural logarithm to both sides of the equation:

log(y) = log(b) + a / x

or

a / x + log(b) = log(y)

By introducing a new parameter b2, defined as log(b), it becomes evident that the parameters a and b2 appear in a linear (affine, really) manner in the new equation:

a / x + b2 = log(y)

Therefore, you can compute the optimal values of those parameters using least squares; all you have left to do is construct the right linear system and then solve it using MATLAB's backslash operator:

A = [1 ./ x, ones(size(x))];
B = log(y);
params_ls = A \ B;

(I'm assuming x and y are column vectors, here.)

Then, the optimal values (in the least-squares sense) for the modified problem are given by:

a_ls = params_ls(1);
b_ls = exp(params_ls(2));

Although those values are not, in general, optimal for the original problem, they are often "good enough" in practice. If needed, you can always use them as initial guesses for some iterative nonlinear-least-squares method.
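Putting the whole answer together, here is a sketch of the two-stage workflow on some hypothetical measurements (the data values and noise level are made up for illustration). Stage 2 uses `fminsearch`, which ships with base MATLAB itself, not with any toolbox, so it keeps within the question's constraint:

```matlab
% Hypothetical data roughly following y = 2.5 * exp(1.3 ./ x)
x = [1; 2; 3; 4; 5];
y = [9.2; 4.8; 3.9; 3.5; 3.2];

% Stage 1: linearized least-squares fit of log(y) = a ./ x + log(b)
A = [1 ./ x, ones(size(x))];
params_ls = A \ log(y);
a_ls = params_ls(1);
b_ls = exp(params_ls(2));

% Stage 2 (optional): refine in the original least-squares sense,
% using the Stage-1 estimates as the initial guess
sse = @(p) sum((y - p(2) * exp(p(1) ./ x)).^2);  % sum of squared errors
p_opt = fminsearch(sse, [a_ls; b_ls]);
a_opt = p_opt(1);
b_opt = p_opt(2);
```

Stage 1 alone minimizes the squared error in log-space (which down-weights large y values); Stage 2 minimizes the squared error of the original model, which is usually what you actually want.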

Doing the log transform and then using linear regression should do it. Wikipedia has a nice section on how to do this: http://en.wikipedia.org/wiki/Linear_least_squares_%28mathematics%29#The_general_problem

    % MATLAB code for finding the best-fit line using the least-squares method
    x = input('enter a');            % input as a matrix: each row is a point [xi yi]
    A = [ones(size(x,1),1), x(:,1)]; % forming A of A*c = b
    b = x(:,2);                      % forming b of A*c = b
    c = A \ b;                       % least-squares solution: c(1) = intercept, c(2) = slope

    % plotting the best-fit line
    xx = linspace(1, 10, 50);
    y = c(1) + c(2) * xx;
    plot(xx, y)

    % plotting the data points for which we found the best-fit line
    hold on
    plot(x(:,1), x(:,2), 'x')
    hold off

I'm sure the code can be cleaned up, but that's the gist of it.
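For a straight-line fit like this one, base MATLAB's `polyfit`/`polyval` do the same least-squares computation in two calls (the data points here are hypothetical):

```matlab
% Degree-1 polynomial fit: p(1) is the slope, p(2) the intercept
xdata = [1; 4; 9];
ydata = [2; 5; 11];
p = polyfit(xdata, ydata, 1);
yfit = polyval(p, linspace(1, 10, 50));  % evaluate the fitted line
```

Note the coefficient order: `polyfit` returns the highest-degree coefficient first, the opposite of the hand-rolled version above.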
