
R: Why can I estimate both a regression AND covariance between two variables with lavaan::sem()?

My understanding is that you cannot specify both a regression and a covariance between the same two variables in lavaan. Nevertheless, I was able to estimate the model below. Is this an issue with the syntax, with model identification, or a bug?

library(lavaan)

#~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
# Specify model
#~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

mod1 <- '
# Measurement model
# - 2 factors {r, w} at 2 time points {1,2}, 4 indicators apiece
# - fix first loading to 1 for identification
# - constrain remaining loadings to be equal across time (shared labels)
r1 =~ 1*x1_1 + lr2*x1_2 + lr3*x1_3 + lr4*x1_4
r2 =~ 1*x2_1 + lr2*x2_2 + lr3*x2_3 + lr4*x2_4
w1 =~ 1*x1_5 + lw2*x1_6 + lw3*x1_7 + lw4*x1_8
w2 =~ 1*x2_5 + lw2*x2_6 + lw3*x2_7 + lw4*x2_8

# Estimate latent factor variances
# - only possible because the model is identified via the fixed 1st loading
r1 ~~ NA*r1
r2 ~~ NA*r2
w1 ~~ NA*w1
w2 ~~ NA*w2

# Estimate covariance between contemporaneously-measured latent factors
r1 ~~ w1
r2 ~~ w2

# Regressions
r2 ~ r1
w2 ~ w1

# Estimate covariance between factors across time
# NOTE: unclear why I can estimate these parameters given regression formulas
r2 ~~ r1
w2 ~~ w1
'
#~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
# Create sample covariance matrix
#~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

# Variable names
vars <- c("x1_1","x1_2","x1_3","x1_4","x1_5","x1_6","x1_7","x1_8",
           "x2_1","x2_2","x2_3","x2_4","x2_5","x2_6","x2_7","x2_8")

# Create matrix
sample_cov <- matrix(data = c(0.999833293078379,0.487956444582057,0.483664056467545,0.400998426513142,0.491202933952403,
                0.353931449316117,0.285944906976342,0.259668866614031,0.597588804705785,0.389983101571606,
                0.389986282460954,0.352688728884064,0.413556920977731,0.310728635134553,0.265334745911224,
                0.275968834910899,0.487956444582057,0.987898167268851,0.597282474952359,0.624490928441204,
                0.247125607027116,0.329128073457,0.292935925099769,0.248056475192883,0.410618949682518,
                0.491198808670857,0.428587638131653,0.415870887874728,0.263648376842619,0.335405511090691,
                0.299434930929212,0.303516673953949,0.483664056467545,0.597282474952359,0.987591489156997,
                0.70697049346826,0.273060714265672,0.32252711049228,0.476538022724238,0.405092823248217,
                0.383809997408897,0.420133217722261,0.532069928685633,0.472834561550818,0.281130854462545,
                0.320879117346738,0.394828887309731,0.367727058341432,0.400998426513142,0.624490928441204,
                0.70697049346826,1.01074290449809,0.196256330906052,0.263382784920277,0.375529025588328,
                0.416310437347982,0.323177010703894,0.396708104792945,0.448962585030441,0.485905309303198,
                0.214011581764438,0.286993920549238,0.324133600586384,0.376046637537441,0.491202933952403,
                0.247125607027116,0.273060714265672,0.196256330906052,0.998610835001528,0.573990161485373,
                0.442329722010083,0.40108886269627,0.344987740402206,0.22482827808808,0.216597438677272,
                0.155839106657515,0.497912057832072,0.350927415574806,0.287668780643262,0.256277533580356,
                0.353931449316117,0.329128073457,0.32252711049228,0.263382784920277,0.573990161485373,
                1.00444245494004,0.666026932163401,0.607357662014435,0.248469574577396,0.288661477004284,
                0.288770354292082,0.236124802603109,0.391592159472882,0.530091538181447,0.416582877505298,
                0.407030707616969,0.285944906976342,0.292935925099769,0.476538022724238,0.375529025588328,
                0.442329722010083,0.666026932163401,1.00046601549333,0.782931683531597,0.213627568752322,
                0.283215107581028,0.366472357153765,0.288307993754945,0.326451332805336,0.432225371555803,
                0.537107909732078,0.507474265680886,0.259668866614031,0.248056475192883,0.405092823248217,
                0.416310437347982,0.40108886269627,0.607357662014435,0.782931683531597,1.00617979684262,
                0.223617186711885,0.264438350331771,0.339792125173085,0.355808519681323,0.284308372166869,
                0.401670251542325,0.475918180445249,0.591423836570424,0.597588804705785,0.410618949682518,
                0.383809997408897,0.323177010703894,0.344987740402206,0.248469574577396,0.213627568752322,
                0.223617186711885,0.989779924090633,0.653160349325124,0.63806316639147,0.594152250976385,
                0.467091528287757,0.395182555441065,0.368953579075386,0.37879303032014,0.389983101571606,
                0.491198808670857,0.420133217722261,0.396708104792945,0.22482827808808,0.288661477004284,
                0.283215107581028,0.264438350331771,0.653160349325124,0.991757099571506,0.762963568718599,
                0.762480884806328,0.384610284210552,0.50670524190018,0.487150198379972,0.477553716549035,
                0.389986282460954,0.428587638131653,0.532069928685633,0.448962585030441,0.216597438677272,
                0.288770354292082,0.366472357153765,0.339792125173085,0.63806316639147,0.762963568718599,
                1.00053469496961,0.822834292300233,0.374705998031281,0.501332849270603,0.573598299835578,
                0.543045239225744,0.352688728884064,0.415870887874728,0.472834561550818,0.485905309303198,
                0.155839106657515,0.236124802603109,0.288307993754945,0.355808519681323,0.594152250976385,
                0.762480884806328,0.822834292300233,1.00022307320538,0.335192622088254,0.462799373720387,
                0.543255621639804,0.601852486639232,0.413556920977731,0.263648376842619,0.281130854462545,
                0.214011581764438,0.497912057832072,0.391592159472882,0.326451332805336,0.284308372166869,
                0.467091528287757,0.384610284210552,0.374705998031281,0.335192622088254,0.990336396260981,
                0.649930931016302,0.548528520259829,0.48836742442375,0.310728635134553,0.335405511090691,
                0.320879117346738,0.286993920549238,0.350927415574806,0.530091538181447,0.432225371555803,
                0.401670251542325,0.395182555441065,0.50670524190018,0.501332849270603,0.462799373720387,
                0.649930931016302,0.997339136613221,0.733410325563141,0.685756835860867,0.265334745911224,
                0.299434930929212,0.394828887309731,0.324133600586384,0.287668780643262,0.416582877505298,
                0.537107909732078,0.475918180445249,0.368953579075386,0.487150198379972,0.573598299835578,
                0.543255621639804,0.548528520259829,0.733410325563141,0.994147669703169,0.811982851534474,
                0.275968834910899,0.303516673953949,0.367727058341432,0.376046637537441,0.256277533580356,
                0.407030707616969,0.507474265680886,0.591423836570424,0.37879303032014,0.477553716549035,
                0.543045239225744,0.601852486639232,0.48836742442375,0.685756835860867,0.811982851534474,
                0.996970290203717),
       nrow = length(vars),
       ncol = length(vars),
       dimnames = list(vars, vars))

#~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
# Fit model with sample covariance matrix
#~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
set.seed(123)

mod_fit <- sem(sample.cov = sample_cov,
                  sample.nobs = 969,
                  model = mod1)
 
#~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
# Summarize parameters
#~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
summary(mod_fit)

After some exploration, I found that this is due to false convergence. Varying the starting values, iteration limits, optimizer, etc. reproduced the problem, but tightening the convergence threshold slightly makes lavaan correctly report that the model failed to converge.
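One way to check this (a sketch, reusing `mod1` and `sample_cov` from the question; it assumes lavaan's default `nlminb` optimizer, whose `rel.tol` setting is passed through via the `control` argument) is to refit with a stricter tolerance and then ask lavaan whether it actually converged:

```r
library(lavaan)

# Refit the same model with a much stricter relative tolerance for nlminb.
# With the redundant r1 ~~ r2 / w1 ~~ w2 covariances on top of the
# regressions, the optimizer should now fail to meet the criterion.
mod_fit_strict <- sem(model = mod1,
                      sample.cov = sample_cov,
                      sample.nobs = 969,
                      control = list(rel.tol = 1e-12))

# TRUE/FALSE convergence flag
lavInspect(mod_fit_strict, "converged")

# If the fit had truly converged, the gradient at the solution
# should be (numerically) zero
max(abs(lavInspect(mod_fit_strict, "gradient")))
```

When `converged` is `FALSE` (or the gradient is far from zero), the parameter estimates printed by `summary()` are not a trustworthy solution, which is consistent with the covariance parameters not being separately identified from the regressions.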
