I have the following code that I managed to piece together:
    private void CalcMSE(List<Point> data)
    {
        double sum = 0.0;
        foreach (Point item in data)
        {
            double difference = item.m_x - item.m_y;
            sum = sum + difference * difference;
        }
        double mse = sum / x; // <-- Don't know what x should be!
        Console.WriteLine("The mean square error is {0}", mse);
    }
The inputs are:

    point.m_x = 3;
    point.m_y = 1;
    pointList.Add(point);
    point.m_x = 4;
    point.m_y = 4;
    pointList.Add(point);
    point.m_x = 5;
    point.m_y = 6;
    pointList.Add(point);
    point.m_x = 6;
    point.m_y = 6;
    pointList.Add(point);
    point.m_x = 8;
    point.m_y = 10;
    pointList.Add(point);
Here's what the outputs should be, according to those in the know: the MSE should be 0.77 and the MSR 40.89. But I have no idea what the "formula" for x is (see the code comment). Can anyone who knows about linear regression assist me?
First of all, I think you should always create a new point before adding it to the list (`point = new Point(x, y)`); I don't think `List.Add` creates a copy, so reusing the same point means every entry can end up holding the last values assigned.

As for the x, I think `data.Count` will do.
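A self-contained sketch of both ideas, assuming a hypothetical `Point` class with the `m_x`/`m_y` fields from the question. Note that dividing by `data.Count` gives 1.8 for these inputs, not 0.77; the quoted MSE = 0.77 and MSR = 40.89 are consistent with a simple linear regression of `m_y` on `m_x`, where MSR is the regression sum of squares and MSE is the residual sum of squares divided by n − 2, so that variant is sketched as well:

```csharp
using System;
using System.Collections.Generic;

// Assumed shape of the Point class used in the question.
class Point
{
    public double m_x;
    public double m_y;
    public Point(double x, double y) { m_x = x; m_y = y; }
}

static class Stats
{
    // The suggested fix: x = data.Count (plain mean of squared x - y differences).
    public static double CalcMSE(List<Point> data)
    {
        double sum = 0.0;
        foreach (Point item in data)
        {
            double difference = item.m_x - item.m_y;
            sum += difference * difference;
        }
        return sum / data.Count;
    }

    // Regression-based variant: fit y = a + b*x by least squares,
    // then MSR = b * Sxy and MSE = SSE / (n - 2).
    public static (double mse, double msr) RegressionMseMsr(List<Point> data)
    {
        int n = data.Count;
        double sx = 0, sy = 0;
        foreach (var p in data) { sx += p.m_x; sy += p.m_y; }
        double mx = sx / n, my = sy / n;

        double sxx = 0, sxy = 0, syy = 0;
        foreach (var p in data)
        {
            double dx = p.m_x - mx, dy = p.m_y - my;
            sxx += dx * dx;
            sxy += dx * dy;
            syy += dy * dy;
        }

        double b = sxy / sxx;       // least-squares slope
        double msr = b * sxy;       // regression sum of squares (1 df)
        double sse = syy - msr;     // residual sum of squares
        double mse = sse / (n - 2); // residual mean square
        return (mse, msr);
    }
}

class Program
{
    static void Main()
    {
        // A *new* Point is created for each entry, as recommended above.
        var pointList = new List<Point>
        {
            new Point(3, 1), new Point(4, 4), new Point(5, 6),
            new Point(6, 6), new Point(8, 10)
        };

        Console.WriteLine("Mean of squared (x - y): {0}", Stats.CalcMSE(pointList));

        var (mse, msr) = Stats.RegressionMseMsr(pointList);
        Console.WriteLine("Regression MSE = {0:F2}, MSR = {1:F2}", mse, msr);
    }
}
```

With these five points the first line prints 1.8, and the regression variant prints MSE = 0.77 and MSR = 40.89, matching the expected outputs in the question.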