
GEKKO Python - simple linear regression from Documentation

Following the GEKKO documentation, I used the example for linear and polynomial regression. Here is just the part for the simple linear regression.

from gekko import GEKKO
import numpy as np

xm = np.array([0.1,0.3,0.2,0.5,0.8])
ym = np.array([0.52,0.53,0.4,0.6,1.01])
#### Solution
m = GEKKO()
m.options.IMODE=2
# coefficients
c = [m.FV(value=0) for i in range(2)]
c[0].STATUS=1
c[1].STATUS=1

x = m.Param(value=xm)
yd = m.Param(value=ym)

y = m.CV(value=ym)
y.FSTATUS = 1

######### uncomment ############
#y = m.Var()
#m.Minimize((y-yd)**2)
################################
# polynomial model

m.Equation(y==c[0]+c[1]*x)
# linear regression
m.solve(disp=False)
p1 = [c[1].value[0], c[0].value[0]]
print(p1)

I just wonder why different results are obtained when uncommenting the lines

y = m.Var()
m.Minimize((y-yd)**2)

It seems that the results obtained (linear, quadratic, cubic) in the documentation are not the least-squares ones. What minimizing criterion was used in those cases?

Best Regards, Radovan

Switch to m.options.EV_TYPE=2 to get a squared-error objective.

The default in Gekko is the l1-norm objective. Here is a description of the differences:

Objective function (equations shown in the GEKKO documentation)

The l1-norm objective (sum of the absolute error) is less sensitive to outliers and bad data.
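Independent of GEKKO, the difference between the two objectives can be sketched with plain numpy/scipy on the same data: a least-squares (l2) fit via np.polyfit versus an l1 fit that minimizes the sum of absolute residuals. The l1 solver here (Nelder-Mead on the absolute-error loss) is just an illustrative choice, not what GEKKO does internally.

```python
import numpy as np
from scipy.optimize import minimize

xm = np.array([0.1, 0.3, 0.2, 0.5, 0.8])
ym = np.array([0.52, 0.53, 0.4, 0.6, 1.01])

# l2 fit: minimizes sum of squared residuals (closed form)
p_l2 = np.polyfit(xm, ym, 1)  # [slope, intercept]

# l1 fit: minimizes sum of absolute residuals
def l1_loss(p):
    return np.sum(np.abs(ym - (p[0] * xm + p[1])))

res = minimize(l1_loss, x0=p_l2, method="Nelder-Mead")
p_l1 = res.x

print("l2 (least squares):", p_l2)
print("l1 (sum abs error):", p_l1)
```

The two parameter vectors generally differ, which is exactly why the commented-out `m.Minimize((y-yd)**2)` lines give a different answer than the default l1-norm solve.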
