
Least square regression model

I am wondering if someone can help me understand what is behind `approx` and `approxfun`. I know that these two functions perform a linear interpolation, but I didn't find any reference on how they do it. I guess they use a least square regression model, but I am not sure.

Finally, if it's true that they use a least square regression model, what is the difference between them and `lm` + `predict`?
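For intuition, here is a minimal sketch (with made-up data) of the difference the question asks about: `approx` reproduces every data point exactly and only connects neighbouring points, while `lm` + `predict` fits a single global least-squares line to all the points.

```r
x <- c(1, 2, 3, 4)
y <- c(1, 4, 9, 16)                    # y = x^2, clearly nonlinear

## Linear interpolation: passes exactly through every (x, y) pair,
## so a point between two knots lies on the local segment
interp <- approx(x, y, xout = 2.5)$y   # midpoint of segment (2,4)-(3,9)

## Least squares: one straight line fitted through all four points
fit  <- lm(y ~ x)
pred <- predict(fit, newdata = data.frame(x = 2.5))

interp   # 6.5  (local: (4 + 9) / 2)
pred     # 7.5  (global line y = -5 + 5x, evaluated at x = 2.5)
```

The two answers differ because interpolation is local (only the two bracketing points matter), whereas least squares is global (every point pulls on the fitted line).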

As commented, you should read the source code. No least squares is involved; `approx` and `approxfun` solve an interpolation problem:

Find y(v), given (x, y)[i], i = 0, .., n-1

For example, `approxfun` uses a simple algorithm for linear interpolation:

  1. Approximate y(v), given (x, y)[i], i = 0, .., n-1
  2. Find the enclosing interval (i, j) by bisection
  3. Use i, j for linear interpolation

Here is R code that paraphrases the C function `approx1`:

approx1 <- function(v, x, y)
{
  ## Approximate y(v), given (x, y)[i], i = 1, .., n

  i <- 1
  j <- length(x)

  ## find the enclosing interval [x[i], x[j]] by bisection
  while (i < j - 1) {
    ij <- floor((i + j) / 2)
    if (v < x[ij])
      j <- ij
    else
      i <- ij
  }

  ## exact hits on the knots
  if (v == x[j]) return(y[j])
  if (v == x[i]) return(y[i])

  ## linear interpolation between (x[i], y[i]) and (x[j], y[j])
  y[i] + (y[j] - y[i]) * ((v - x[i]) / (x[j] - x[i]))
}
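As a quick check (a sketch, with made-up data), the paraphrase agrees with the built-in `stats::approx` for points inside the data range; the definition is restated compactly below so the example runs standalone. Note that unlike `approx`, which by default returns `NA` outside the range, this simplified version extrapolates along the end segments.

```r
## Compact restatement of the approx1 paraphrase above
approx1 <- function(v, x, y) {
  i <- 1; j <- length(x)
  while (i < j - 1) {                  # bisection for the enclosing interval
    ij <- floor((i + j) / 2)
    if (v < x[ij]) j <- ij else i <- ij
  }
  if (v == x[j]) return(y[j])
  if (v == x[i]) return(y[i])
  y[i] + (y[j] - y[i]) * ((v - x[i]) / (x[j] - x[i]))
}

x <- c(0, 1, 2, 5)
y <- c(0, 10, 20, 50)

approx1(1.5, x, y)             # 15
approx(x, y, xout = 1.5)$y     # 15, stats::approx agrees inside the range
```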
