Code:
from scipy.optimize import curve_fit
import numpy as np
from numpy import *

def func(x, a, b):
    return ff(x, a, b)

ff = lambda x, a, b: eval("1/(a*x+b)")

xdata = [1, 2, 4, 6, 8, 10]
ydata = [0.22, 0.1, 0.06, 0.04, 0.03, 0.024]

popt, pcov = curve_fit(func, xdata, ydata)
print('\n', '[a b] for the best fit = ', popt, '\n')
When this runs, it gives:
[a b] for the best fit = [ 4.62673137 -0.04794652]
Meanwhile, according to my scientific calculator (or by solving this manually), the answer should be:
[a b] for the best fit = [ 0.9232 4.05396]
I have tested the program repeatedly; this is not the only example where it fails to give correct results.
I've checked your calculations:
import matplotlib.pyplot as plt
plt.scatter(xdata, ydata)
plt.scatter(xdata, func(np.array(xdata), 4.62673137, -0.04794652), c='r')
plt.scatter(xdata, func(np.array(xdata), 0.9232, 4.05396), c='g')
plt.show()
and found that curve_fit() gave well-fitting parameters. The problem is with your other solution.
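The same conclusion can be checked numerically rather than visually. A quick sketch (variable names are mine) comparing the sum of squared residuals of the two parameter sets against the data:

```python
import numpy as np

def func(x, a, b):
    return 1.0 / (a * x + b)

xdata = np.array([1, 2, 4, 6, 8, 10], dtype=float)
ydata = np.array([0.22, 0.1, 0.06, 0.04, 0.03, 0.024])

# Sum of squared residuals for each parameter set
sse_fit = np.sum((ydata - func(xdata, 4.62673137, -0.04794652)) ** 2)
sse_manual = np.sum((ydata - func(xdata, 0.9232, 4.05396)) ** 2)

print(sse_fit, sse_manual)  # curve_fit's error is orders of magnitude smaller
```

The curve_fit parameters leave a far smaller residual, so they are the better least-squares fit.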
Don't use from X import *. It makes code extremely hard to manage in some cases.
Don't use eval() if you don't need it. In this case:
def func(x, a, b):
    return 1. / (a * x + b)
is shorter and clearer.
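Putting both suggestions together, a minimal self-contained version of the fit (same data as above, no star import, no eval) might look like:

```python
import numpy as np
from scipy.optimize import curve_fit

def func(x, a, b):
    # Model y = 1 / (a*x + b), written directly instead of via eval()
    return 1.0 / (a * x + b)

xdata = np.array([1, 2, 4, 6, 8, 10], dtype=float)
ydata = np.array([0.22, 0.1, 0.06, 0.04, 0.03, 0.024])

popt, pcov = curve_fit(func, xdata, ydata)
print(popt)  # close to [4.6267, -0.0479], matching the output above
```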