
GEKKO Python - simple linear regression from Documentation

Following the GEKKO documentation, I used the example for linear and polynomial regression. Here is just the part regarding the simple linear regression.

from gekko import GEKKO
import numpy as np

xm = np.array([0.1,0.3,0.2,0.5,0.8])
ym = np.array([0.52,0.53,0.4,0.6,1.01])
             
#### Solution
m = GEKKO()
m.options.IMODE=2
# coefficients
c = [m.FV(value=0) for i in range(2)]
c[0].STATUS=1
c[1].STATUS=1

x = m.Param(value=xm)
yd = m.Param(value=ym)

y = m.CV(value=ym)
y.FSTATUS = 1

######### uncomment ############
#y = m.Var()
#m.Minimize((y-yd)**2)
################################
# linear model
m.Equation(y==c[0]+c[1]*x)

# solve the regression
m.solve(disp=False)
p1 = [c[1].value[0],c[0].value[0]]
print(p1)   # [slope, intercept]

I just wonder why different results are obtained when uncommenting the lines

y = m.Var()
m.Minimize((y-yd)**2)

It seems that the results obtained (linear, quadratic, cubic) in the documentation are not the least-squares ones. What minimization criterion was used in those cases?
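One quick way to see what the least-squares line for this data looks like is np.polyfit; the explicit m.Minimize((y-yd)**2) formulation should reproduce roughly these coefficients:

import numpy as np

xm = np.array([0.1,0.3,0.2,0.5,0.8])
ym = np.array([0.52,0.53,0.4,0.6,1.01])

# ordinary least-squares fit of a line, for comparison
p_ls = np.polyfit(xm, ym, 1)   # returns [slope, intercept]
print(p_ls)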

Best Regards, Radovan

Switch to m.options.EV_TYPE=2 to get a squared-error objective.

The default in Gekko is the l1-norm objective. Here is a description of the differences:

[Figure: objective function]

The l1-norm objective (sum of the absolute errors) is less sensitive to outliers and bad data.
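A minimal sketch of that change, reusing the model from the question and only setting EV_TYPE=2 so the fit minimizes the sum of squared errors instead of the default l1-norm:

from gekko import GEKKO
import numpy as np

xm = np.array([0.1,0.3,0.2,0.5,0.8])
ym = np.array([0.52,0.53,0.4,0.6,1.01])

m = GEKKO()
m.options.IMODE = 2      # regression mode
m.options.EV_TYPE = 2    # squared-error objective (default is 1, the l1-norm)

# coefficients to estimate
c = [m.FV(value=0) for i in range(2)]
c[0].STATUS = 1
c[1].STATUS = 1

x = m.Param(value=xm)
y = m.CV(value=ym)
y.FSTATUS = 1            # use the measured values in the objective

m.Equation(y == c[0] + c[1]*x)
m.solve(disp=False)
print([c[1].value[0], c[0].value[0]])   # [slope, intercept]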
