
Calculate Gibbs Sampler (Albert and Chib) for Binary Probit for 2 y vectors

Presently, I am experimenting with this function from the bayesm package in R:

rbprobitGibbs(Data, Prior, Mcmc)

When I work through the example in the help page for rbprobitGibbs in RStudio, it uses a single n x 1 y vector, with this code:

##
## rbprobitGibbs example
##
library(bayesm)

if(nchar(Sys.getenv("LONG_TEST")) != 0) {R=2000} else {R=10}

set.seed(66)

## function to simulate from a binary probit model given X and beta
simbprobit = function(X, beta) {
  y = ifelse((X %*% beta + rnorm(nrow(X))) < 0, 0, 1)
  list(X=X, y=y, beta=beta)
}

nobs = 200
X = cbind(rep(1,nobs), runif(nobs), runif(nobs))
beta = c(0, 1, -1)
nvar = ncol(X)
simout = simbprobit(X, beta)

Data1 = list(X=simout$X, y=simout$y)
Mcmc1 = list(R=R, keep=1)

out = rbprobitGibbs(Data=Data1, Mcmc=Mcmc1)

summary(out$betadraw, tvalues=beta)

if(0) {
  ## plotting example
  plot(out$betadraw, tvalues=beta)
}

Could anyone please tell me how I would update the above code so that rbprobitGibbs (or a related function) handles 2, or really any number of, y vectors? The first four columns below are the data from the sample; the y2 column is the one I added:

X.1 X.2 X.3 y1  y2
1   0.9899365748    0.2189661944    1   1
1   0.8296926252    0.9985278335    0   0
1   0.5858869043    0.4508356177    1   1
1   0.4170461195    0.5318313679    1   0
1   0.6620968801    0.5661155705    0   0
1   0.3793020903    0.4270007173    0   0
1   0.4242656054    0.7121669375    0   0
1   0.8425126909    0.0662683414    1   1
1   0.3764421751    0.5297549635    1   1
1   0.9115300649    0.6154659826    0   1
1   0.2585383623    0.861666956 0   0
1   0.2032627692    0.6300459378    0   0
1   0.2246137869    0.8552307657    0   0
1   0.9146362843    0.7148247214    0   0
1   0.8902309975    0.4851149949    1   1
1   0.2795407828    0.750800706 1   1
1   0.4293785084    0.2013569023    1   1
1   0.2740470199    0.1700141362    0   0
1   0.5124748724    0.3394885478    1   1
1   0.3552741136    0.4214046651    0   0
1   0.1278469148    0.7162302185    0   0
1   0.2943667781    0.5867321899    0   0
1   0.1614354805    0.4957168296    1   1
1   0.5066605383    0.5413606348    1   1
1   0.3777150062    0.4739717268    1   1
1   0.1853615218    0.3201939461    0   0
1   0.0973946755    0.8336658452    0   0
1   0.8443719402    0.5456923875    1   1
1   0.1721531036    0.1616139291    0   0
1   0.0249645228    0.8461988585    1   1
1   0.5729625325    0.0658818937    1   1
1   0.2316438637    0.3755099219    1   1
1   0.3851218582    0.0061001917    0   0
1   0.1498707412    0.1630114091    0   0
1   0.5519695727    0.5340546053    1   1
1   0.278166075 0.4493952126    1   1
1   0.5418767433    0.4024042937    0   0
1   0.9727253027    0.3172571538    1   1
1   0.6974664936    0.9042428946    1   1
1   0.4901860307    0.8092677956    1   1
1   0.4725901203    0.6991247302    1   1
1   0.24298591  0.702485577 1   1
1   0.5262798222    0.3325769177    1   1
1   0.998526135 0.2580267722    1   1
1   0.9664521548    0.1097839777    1   1
1   0.4240157611    0.6493905277    1   1
1   0.2575837581    0.6835418411    1   1
1   0.9943112358    0.0192416003    1   1
1   0.1853664671    0.9032788952    1   1
1   0.758035332 0.5458013031    0   0
1   0.0594366395    0.1449510658    0   0
1   0.1858018977    0.3166872617    1   0
1   0.6369692264    0.3176534963    1   1
1   0.6418922141    0.6724918736    0   0
1   0.6624171205    0.6146978349    0   0
1   0.4099654155    0.4726984168    1   1
1   0.3839501769    0.6608771463    0   0
1   0.4312540782    0.7527925344    0   0
1   0.063719698 0.1763143158    1   1
1   0.3058558635    0.217456545 0   0
1   0.9693015772    0.1797125733    1   1
1   0.1807862343    0.2376139732    0   1
1   0.8151498642    0.0461780084    1   1
1   0.6328742586    0.35979084  1   1
1   0.3616710461    0.969523683 0   0
1   0.281503205 0.7784934922    1   1
1   0.7212781536    0.7413479725    0   0
1   0.092384431 0.5874917486    0   0
1   0.155009425 0.8415817369    0   0
1   0.8922143609    0.6630925562    1   1
1   0.991042807 0.0252894   1   1
1   0.4425945459    0.8692532447    0   0
1   0.3441495837    0.9671099493    1   1
1   0.1620598785    0.3095594603    1   1
1   0.8981178766    0.8030338185    1   1
1   0.2345796963    0.4723579886    0   0
1   0.5316903049    0.510752735 1   1
1   0.7979545509    0.330074837 1   1
1   0.2654725285    0.205431531 1   1
1   0.3717567755    0.6099570901    1   1
1   0.2746311817    0.5846611073    1   1
1   0.6063400249    0.5772228637    0   0
1   0.0473926074    0.2661940577    1   1
1   0.6541800608    0.9317179173    1   1
1   0.9319066966    0.6864310182    0   0
1   0.6435363619    0.6030408067    1   1
1   0.2995573319    0.9094474597    1   1
1   0.171410341 0.437170333 0   0
1   0.6401165337    0.3053631645    1   1
1   0.3017032004    0.0369891548    0   0
1   0.3860893343    0.6809475985    0   0
1   0.4636157434    0.2839248523    0   0
1   0.1536822913    0.8689155467    0   0
1   0.6962356898    0.4797217257    1   1
1   0.4657603616    0.4761086726    1   1
1   0.2734947701    0.3373264791    0   1
1   0.0445950362    0.1672765177    0   0
1   0.5687612037    0.4581195179    0   0
1   0.4707629299    0.4836922633    0   0
1   0.751863339 0.2289583082    1   1
1   0.259972115 0.4522923154    0   0
1   0.710883829 0.531755195 0   0
1   0.328196065 0.3619224844    0   0
1   0.3967730084    0.0183546853    0   0
1   0.215346405 0.7562066903    0   0
1   0.1864551497    0.5278709074    0   0
1   0.9422974272    0.6206218491    1   1
1   0.6341338102    0.3694500523    1   1
1   0.6171285694    0.4848789659    1   0
1   0.5353122416    0.1012676626    1   1
1   0.9107316111    0.4414798988    0   0
1   0.364191507 0.5692639218    0   0
1   0.7192769244    0.871085976 0   0
1   0.5625851532    0.6071606656    0   0
1   0.6643338243    0.3915698901    1   1
1   0.1365202225    0.8386956837    1   1
1   0.1951833158    0.6553574409    0   0
1   0.3314071957    0.5843182474    0   0
1   0.94224557  0.3040650734    1   1
1   0.0419700646    0.7691562055    0   0
1   0.43872105  0.0486131848    0   0
1   0.5100067658    0.6341659653    1   1
1   0.1222995215    0.5880768083    0   0
1   0.1578365152    0.6505905646    1   1
1   0.3995162598    0.8664736878    1   1
1   0.947365944 0.2596085703    0   0
1   0.5945339077    0.4999730913    1   1
1   0.6945818942    0.5181555478    1   1
1   0.1378006986    0.7515407458    0   0
1   0.2357496801    0.6892036018    1   1
1   0.333463666 0.3506809592    1   1
1   0.7776793852    0.9664791177    1   1
1   0.254368332 0.3952343608    1   0
1   0.3824195687    0.4121196822    1   1
1   0.1427211207    0.0237048781    1   1
1   0.7924600174    0.8477656154    1   1
1   0.3051217012    0.5423933757    0   0
1   0.4072035847    0.612145344 1   1
1   0.0192040021    0.3364684819    1   1
1   0.7076993815    0.3081624294    1   1
1   0.3519693122    0.4135619369    1   1
1   0.3202238996    0.4520605011    1   1
1   0.2489268424    0.2608512486    0   0
1   0.1009442504    0.7335172631    0   1
1   0.6263556832    0.6289500771    0   0
1   0.7245263092    0.6750859108    1   1
1   0.8317463186    0.8096570096    0   0
1   0.2434659142    0.9338806737    0   0
1   0.4874795836    0.3839077372    0   0
1   0.6801484544    0.1946305826    1   1
1   0.643040627 0.7182156285    1   1
1   0.5217843335    0.7021823346    1   1
1   0.9625178338    0.9834434157    1   1
1   0.6535910789    0.5019235369    0   0
1   0.94050515  0.7863713149    1   1
1   0.8751446512    0.1334866604    1   1
1   0.8634884465    0.3494574034    0   0
1   0.91536376  0.2091044907    1   1
1   0.1773750747    0.652113735 0   0
1   0.225825988 0.125594883 1   1
1   0.9055066747    0.4528328346    0   0
1   0.7427957433    0.924980263 0   0
1   0.4384290115    0.3252121131    1   1
1   0.255896548 0.3704734396    1   1
1   0.0225097234    0.5561372044    0   0
1   0.6844344195    0.4750160228    1   1
1   0.1892332132    0.9139610652    0   0
1   0.2052155717    0.7026022857    0   0
1   0.7506253854    0.700289489 1   1
1   0.1649265396    0.1530481223    1   1
1   0.5506814695    0.7213967661    0   1
1   0.2912465036    0.7983302297    1   1
1   0.7702643026    0.527433767 1   1
1   0.9710474953    0.8381176256    0   0
1   0.8290705713    0.971519822 1   1
1   0.632064248 0.6417001493    1   1
1   0.4600081004    0.1881597352    1   1
1   0.7718191738    0.8489826284    0   0
1   0.8830631888    0.638571579 0   0
1   0.2392087474    0.1479197666    1   1
1   0.6529120326    0.3340752197    1   1
1   0.4075479258    0.746168833 0   0
1   0.1362040308    0.127894334 0   0
1   0.5595724836    0.9276634019    1   1
1   0.9293997863    0.3116247791    1   1
1   0.0010047755    0.151142987 1   1
1   0.8327829326    0.3474256487    1   1
1   0.4180099745    0.8976569436    0   0
1   0.6827635004    0.8410718208    0   0
1   0.2065993831    0.5254033669    1   1
1   0.6488550527    0.3361348431    0   0
1   0.960010844 0.4958175228    0   0
1   0.8394188522    0.7944452723    1   1
1   0.5054272702    0.4813854268    0   0
1   0.9041459351    0.1566028744    1   1
1   0.6693580367    0.0554579678    1   1
1   0.3972949465    0.6975388743    0   0
1   0.7972005522    0.7535337424    0   1
1   0.8931960308    0.5674132451    0   0
1   0.1921383152    0.055920549 1   1   
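One straightforward option with rbprobitGibbs itself is simply to run the sampler once per outcome column. Below is a minimal sketch, assuming the pasted data has been read into a data frame I am calling dat (a placeholder name) with the column names shown above; it treats the outcomes as independent, so it ignores any correlation between y1 and y2:

library(bayesm)

## Assumed: 'dat' is a data frame holding the pasted data, with columns
## X.1, X.2, X.3 (covariates) and y1, y2 (binary outcomes).
X  <- as.matrix(dat[, c("X.1", "X.2", "X.3")])
ys <- dat[, c("y1", "y2")]   # add further y columns here as needed

## Run the univariate binary-probit Gibbs sampler once per y vector
## (R = 2000 draws is an arbitrary choice for illustration)
outs <- lapply(ys, function(y) {
  rbprobitGibbs(Data = list(X = X, y = y), Mcmc = list(R = 2000, keep = 1))
})

## Posterior summaries of the beta draws for each outcome
lapply(outs, function(o) summary(o$betadraw))

Each element of outs then holds a separate set of beta draws for the corresponding y column.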

UPDATE:

FYI, this is the book I am working through:

http://ca.wiley.com/WileyCDA/WileyTitle/productCd-0470863676.html

Actually, I found out I was working with the wrong bayesm function. Instead, I should be using the following, which fits a multivariate probit model and therefore handles multiple y vectors:

rmvpGibbs - implements the Edwards/Allenby Gibbs Sampler for the multivariate probit model
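As far as I can tell from the rmvpGibbs help page, its Data argument needs p (the number of binary outcomes per observation), a stacked y vector of length n*p, and an (n*p) x k design matrix X with one row per outcome equation per observation. Below is a minimal sketch of how the call might look for the data above; again, dat is a placeholder data frame holding the pasted data, and giving each outcome equation its own coefficients on the same three covariates is just one possible way to set up X:

library(bayesm)

## Assumed: 'dat' is a data frame holding the pasted data, with columns
## X.1, X.2, X.3 (covariates) and y1, y2 (binary outcomes).
p    <- 2                                        # outcomes per observation
Xobs <- as.matrix(dat[, c("X.1", "X.2", "X.3")])
n    <- nrow(Xobs)
k    <- ncol(Xobs)

## Stack the outcomes observation by observation: y1_1, y2_1, y1_2, y2_2, ...
y <- as.vector(t(as.matrix(dat[, c("y1", "y2")])))

## Build the (n*p) x (p*k) design matrix: each observation contributes p rows,
## and row j of that block carries the covariates in the slot for equation j,
## so every outcome equation gets its own coefficient vector.
X <- matrix(0, n * p, p * k)
for (i in 1:n) {
  for (j in 1:p) {
    X[(i - 1) * p + j, ((j - 1) * k + 1):(j * k)] <- Xobs[i, ]
  }
}

Data2 <- list(p = p, y = y, X = X)
Mcmc2 <- list(R = 2000, keep = 1)   # arbitrary number of draws for illustration

out2 <- rmvpGibbs(Data = Data2, Mcmc = Mcmc2)

summary(out2$betadraw)
summary(out2$sigmadraw)

Note that the multivariate probit is only identified up to the scale of each latent equation, so in practice the beta draws are usually rescaled by the corresponding diagonal elements of the Sigma draws before summarizing; the sketch above skips that step.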
