I created power regression equations for length vs dry mass in both R and Excel but the coefficients do not match.
I followed Hong Ooi's answer from this question: Power regression in R similar to excel. In that answer, the R code reproduces the power equation from Excel's trendline. However, when I tried it, I got some very strange coefficients, and the Excel power-trendline equation is much more accurate when tested with random lengths.
Code as follows:
#sample dataset of Lengths and Dry Masses
test <- structure(list(
Length = c(23, 17, 16, 25, 15, 25, 11, 22, 13, 21, 31),
DryMass = c(3.009, 1.6, 1, 4.177, 0.992, 6.166, 0.7, 1.73, 0.613, 3.429, 7.896)),
.Names = c("Length", "DryMass"),
row.names = c(NA, 11L),
class = "data.frame")
#log-log regression
lm(formula = log(Length) ~ log(DryMass), data = test)
Coefficients:
(Intercept) log(DryMass)
2.7048 0.3413
This should give me the equation "14.9515*x^0.3413" once I back-transform the intercept (EXP(2.7048) = 14.9515). I tested it with some random lengths and the predictions are way off.
However, the equation given by Excel is "0.0009*x^2.6291" which, when tested, was very accurate. I would just use the equation from Excel but I need to make 50 more of these and would love to automate it using R.
You are trying to fit the following model.
library(ggplot2)
ggplot(test, aes(x = log(DryMass), y = log(Length))) +
theme_bw() +
geom_point() +
scale_y_continuous(limits = c(0, 5)) +
geom_smooth(formula = y ~ x, method = "lm", se = FALSE)
(Intercept) (the first coefficient) is where the fitted line crosses the y axis at x = 0. On the plot above, that appears to be between 2.5 and 3, so roughly 2.8, which is quite close to the 2.7 that lm reports. So R's result is consistent with the data you gave it; more likely something is set up differently in Excel that isn't shown in the question.
Edit:
You switched x and y in R: Excel is fitting DryMass as a function of Length, but your lm call puts log(Length) on the left-hand side. Swap them and the back-transformed intercept matches Excel's:
mod_linearized <- lm(formula = log(DryMass) ~ log(Length), data = test)
exp(coef(mod_linearized)[1])
# (Intercept)
#0.0008775079
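Putting the slope and the back-transformed intercept together gives the full power equation, which should line up with Excel's "0.0009*x^2.6291" (a sketch using the sample data from the question; `fit` is just an illustrative name):

```r
# Sample data from the question
test <- data.frame(
  Length  = c(23, 17, 16, 25, 15, 25, 11, 22, 13, 21, 31),
  DryMass = c(3.009, 1.6, 1, 4.177, 0.992, 6.166, 0.7, 1.73, 0.613, 3.429, 7.896)
)

# Regress log(DryMass) on log(Length), the same direction as Excel's trendline
fit <- lm(log(DryMass) ~ log(Length), data = test)
a <- unname(exp(coef(fit)[1]))  # multiplicative constant, back-transformed
b <- unname(coef(fit)[2])       # exponent; should be close to Excel's 2.6291

# Predict dry mass for a new length with a * x^b
a * 20 ^ b
```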
Old answer (which might still be useful):
The back-transformation of the linearized model is not the same as the non-linear model because error terms are different:
Back-transformed linearized model results in multiplicative error: y = exp(a) * x ^ b * exp(epsilon)
Non-linear model has an additive error: y = a * x ^ b + epsilon
Basically, the linearization is equivalent to a different weighting of data points (larger values are weighted less). That can actually be desirable (depending on your specific data generating process). But sometimes you want equal weight and then you should fit the non-linear model.
You can do non-linear regression in R:
mod_linearized <- lm(formula = log(Length) ~ log(DryMass), data = test)
exp(coef(mod_linearized)[1])
#(Intercept)
# 14.95152
mod_nonlinear <- nls(Length ~ a * DryMass ^ b, data = test,
#use result from linearization as starting values:
start = list(a = exp(coef(mod_linearized)[1]),
b = coef(mod_linearized)[2]))
coef(mod_nonlinear)[1]
# a
#15.2588
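Since the question mentions needing to repeat this for 50 more datasets, the log-log fit is easy to automate. A minimal sketch, assuming each dataset is a data frame with `Length` and `DryMass` columns like the sample above (`fit_power` and the `datasets` list are hypothetical names for illustration):

```r
# Hypothetical helper: fit the power equation DryMass = a * Length^b
# via the linearized log-log regression and return the coefficients
fit_power <- function(df) {
  m <- lm(log(DryMass) ~ log(Length), data = df)
  c(a = unname(exp(coef(m)[1])), b = unname(coef(m)[2]))
}

test <- data.frame(
  Length  = c(23, 17, 16, 25, 15, 25, 11, 22, 13, 21, 31),
  DryMass = c(3.009, 1.6, 1, 4.177, 0.992, 6.166, 0.7, 1.73, 0.613, 3.429, 7.896)
)

# Apply to a named list of datasets (here the sample data twice,
# standing in for the 50 real datasets)
datasets <- list(site1 = test, site2 = test)
coefs <- t(sapply(datasets, fit_power))
coefs  # one row of (a, b) per dataset
```

The same loop works with `nls()` inside `fit_power` if you want the non-linear fit for every dataset, using the linearized coefficients as starting values as shown above.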