
Convert WINBUGS model to PyMC3

I am currently taking a class on Bayesian statistics. We are allowed to use any package to computationally solve the models, but all of the examples are provided in WINBUGS. I would prefer to use Python and PyMC3. I don't have much experience with PyMC3 and could use some help converting this simple WINBUGS model into a PyMC3 model.

The example WINBUGS code is below. It is a simple binomial model comparing two options with a different number of observations per sample. The model also tests 5 different priors.

model{
  for(i in 1:5){
    n1[i] <- Tot1        # 100
    n2[i] <- Tot2        # 3
    y1[i] <- Positives1
    y2[i] <- Positives2
    y1[i] ~ dbin(p1[i], n1[i])
    y2[i] ~ dbin(p2[i], n2[i])
    diffps[i] <- p1[i] - p2[i]   # 100-seller - 3-seller
  }

  # Uniform priors
  p1[1] ~ dbeta(1, 1);  p2[1] ~ dbeta(1, 1)

  # Jeffreys' priors
  p1[2] ~ dbeta(0.5, 0.5);  p2[2] ~ dbeta(0.5, 0.5)

  # Informative priors centered at about 93% and 97%
  p1[3] ~ dbeta(30, 2);  p2[3] ~ dbeta(2.9, 0.1)

  # Zellner priors, proportional to 1/(p * (1-p))
  logit(p1[4]) <- x[1]
  x[1] ~ dunif(-10000, 10000)  # as dflat()
  logit(p2[4]) <- x[2]
  x[2] ~ dunif(-10000, 10000)  # as dflat()

  # Logit centered at 3 gives mean probs close to 95%
  logit(p1[5]) <- x[3]
  x[3] ~ dnorm(3, 1)
  logit(p2[5]) <- x[4]
  x[4] ~ dnorm(3, 1)
}

DATA
list(Tot1=100,  Tot2=3, Positives1=95, Positives2=3)

INITS
list(p1=c(0.9, 0.9, 0.9, NA,NA), 
p2=c(0.9, 0.9, 0.9, NA,NA), x=c(0,0,0,0))

list(p1=c(0.5, 0.5, 0.5, NA,NA), 
p2=c(0.5, 0.5, 0.5, NA,NA), x=c(0,0,0,0))

list(p1=c(0.3, 0.3, 0.3, NA,NA), 
p2=c(0.3, 0.3, 0.3, NA,NA), x=c(0,0,0,0))
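Two things in this listing are worth double-checking before porting. First, BUGS `dnorm(3, 1)` is parameterized by precision (tau), not standard deviation; here tau = 1 happens to equal sd = 1, but PyMC3's `Normal` expects `sd` (or an explicit `tau=`), so keep that in mind. Second, the comment on the fifth prior can be verified directly: a logit centered at 3 does imply probabilities near 95%. A quick pure-Python check (names here are my own, not from the original):

```python
import math

def inv_logit(x):
    # Inverse of the logit link used in the BUGS model
    return 1.0 / (1.0 + math.exp(-x))

# Prior 5 centers logit(p) at 3, so the implied central value of p is:
print(inv_logit(3))  # roughly 0.95, matching the comment in the model
```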

In PyMC3 I attempted to implement the first of the 5 priors on a single sample (I am not sure how to do both) with the following code:

import numpy as np
import pymc3 as pm

sample2 = np.ones(3)

with pm.Model() as ebay_example:
    prior = pm.Beta('theta', alpha = 1, beta = 1)
    likelihood = pm.Bernoulli('y', p = prior, observed = sample2)
    trace = pm.sample(1000, tune = 2000, target_accept = 0.95)
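For what it's worth, the likelihood in this attempt is not the source of the mismatch: three observed Bernoulli(p) successes carry exactly the same information as one Binomial(n=3, y=3) observation, so under the Beta(1, 1) prior both give the conjugate posterior Beta(4, 1). The gap with BUGS comes from modeling only one of the two samples. A pure-Python check of the equivalence (this helper is illustrative, not from the original code):

```python
# Conjugate Beta-Binomial update: Beta(a, b) prior plus observed
# successes/failures gives a Beta(a + successes, b + failures) posterior.
def beta_posterior_mean(a, b, successes, failures):
    a_post = a + successes
    b_post = b + failures
    return a_post / (a_post + b_post)

# Three Bernoulli successes == one Binomial(n=3) observation with y=3
bern = beta_posterior_mean(1, 1, successes=3, failures=0)
binom = beta_posterior_mean(1, 1, successes=3, failures=3 - 3)
print(bern, binom)  # both 0.8: the two parameterizations agree
```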

The above model ran, but the results don't align with the BUGS results. I am not sure if it is because I didn't do a burn-in or because of some larger issue. Any guidance would be great.

We are taking the same class right now. Below is my code; the difference of means is close to the BUGS results.

import pymc3 as pm

# Data: 100-observation seller and 3-observation seller
na = 100
nb = 3
pos_a = 95
pos_b = 3

with pm.Model() as model:
    # priors
    p0a = pm.Beta('p0a', 1, 1)

    # likelihood
    obs_a = pm.Binomial("obs_a", n=na, p=p0a, observed=pos_a)

    # sample
    trace1_a = pm.sample(1000)


with pm.Model() as model:
    # priors
    p0b = pm.Beta('p0b', 1, 1)

    # likelihood
    obs_b = pm.Binomial("obs_b", n=nb, p=p0b, observed=pos_b)

    # sample
    trace1_b = pm.sample(1000)
# Difference of posterior means between the two sellers
pm.summary(trace1_a)["mean"][0] - pm.summary(trace1_b)["mean"][0]

OUT: 0.1409999999999999
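That 0.141 can be sanity-checked analytically: with a Beta(1, 1) prior the Binomial likelihood is conjugate, so the posteriors are Beta(96, 6) for the 100-seller and Beta(4, 1) for the 3-seller, and the exact difference of posterior means is 96/102 - 4/5 = 12/85 ≈ 0.1412. A pure-Python check, no sampling needed (the helper function is my own sketch):

```python
from fractions import Fraction

# Conjugate update: Beta(a, b) prior + Binomial(n, y) data -> Beta(a + y, b + n - y),
# whose mean is (a + y) / (a + b + n).
def posterior_mean(a, b, n, y):
    return Fraction(a + y, a + b + n)

# Uniform Beta(1, 1) priors, data from the question
mean_a = posterior_mean(1, 1, n=100, y=95)  # 96/102
mean_b = posterior_mean(1, 1, n=3, y=3)     # 4/5
print(float(mean_a - mean_b))  # ~0.1412, close to the sampled 0.141
```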
