
How to correctly use tensorflow_probability to sample from a function of random variables?

I'm interested in the features of bijectors in tensorflow_probability, so I tried to sample from a random-variable function constructed with tfp.bijectors.

I provide my test code below, with some details here: the case I tested is the chi-squared distribution. I drew samples from the Chi2(2) distribution in two different ways: (1) directly using the Chi2 API in TensorFlow Probability; (2) using tfp.bijectors and the relationship between Chi2(2) and the standard normal distribution N(0, 1): if X, Y iid ~ N(0, 1) and Z = g(X, Y) = X^2 + Y^2, then Z ~ Chi2(2). My results are shown below: the means of the two groups of samples are approximately equal, but the two standard deviations differ considerably. Could anyone tell me where I went wrong and how to use tensorflow_probability correctly?

import os

import matplotlib.pyplot as plt
import numpy as np
import tensorflow as tf
import tensorflow_probability as tfp
from scipy.stats import chi2

os.environ['TF_CPP_MIN_LOG_LEVEL'] = '2'

tf.reset_default_graph()   # Clear computational graph before calc again!!!

tfd = tfp.distributions
tfb = tfp.bijectors

n_samples = 2000

chi2_origin = tfd.Chi2(2)
s_chi2_origin = chi2_origin.sample([n_samples])

base_normal = tfd.Normal(loc=0., scale=1.)
n_to_chi1_bij = tfb.Square()
n_to_chi2_bij = tfb.Chain([tfb.AffineScalar(shift=0., scale=2.), tfb.Square()])

target_Chi = tfd.TransformedDistribution(
    distribution=base_normal,
    bijector=n_to_chi2_bij,
    name="Chi_x_constructed"
)
s_chi1_constru = target_Chi.sample([n_samples])

with tf.Session() as sess:
    init_op = tf.local_variables_initializer()
    sess.run(init_op)
    s_chi2_origin_ = sess.run(s_chi2_origin)
    # print("Samples by Chi2_ORIGIN", s_chi2_origin_)
    print("Origin :  mean={:.4f}, std={:.4f}".
          format(s_chi2_origin_.mean(), s_chi2_origin_.std()))

    s_chi2_constru_ = sess.run(s_chi1_constru)
    # print("Samples by Chi1_CONSTRU:", s_chi1_constru_[-5:-1])
    print("Constru:  mean={:.4f}, std={:.4f}".
          format(s_chi2_constru_.mean(), s_chi2_constru_.std()))

x = np.arange(0, 15, .5)
y = chi2(2).pdf(x)

fig, (ax0, ax1) = plt.subplots(1, 2, sharey=True, figsize=(6,4))
ax0.hist(s_chi2_origin_, bins='auto', density=True)
ax0.plot(x, y, 'r-')
ax1.hist(s_chi2_constru_, bins=200, density=True)
ax1.plot(x, y, 'r-')
plt.show()

And here is my result. The Origin line is computed directly with the Chi2 API in TensorFlow Probability, and the left image shows that result; the Constru line and the right image are obtained via tfp.bijectors.

[Figure: histograms of the original samples (left) and the bijector-constructed samples (right), each overlaid with the Chi2(2) pdf.]

I think you have the entries in your chain reversed. They are called right to left, as in the mathematical notation of function composition.
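
To illustrate the right-to-left convention, here is a minimal sketch (not from the original answer) that only demonstrates the ordering of tfb.Chain. It assumes a TFP release like the one in the question where tfb.AffineScalar is still available (newer releases replace it with tfb.Scale and tfb.Shift), and eager execution (under a TF1 graph you would wrap the forward calls in sess.run):

import tensorflow_probability as tfp

tfb = tfp.bijectors

square = tfb.Square()
scale2 = tfb.AffineScalar(shift=0., scale=2.)

# Chain([b_outer, b_inner]).forward(x) == b_outer.forward(b_inner.forward(x)),
# so the right-most bijector in the list is applied first.
as_written = tfb.Chain([scale2, square])  # forward(x) = 2 * x**2
swapped = tfb.Chain([square, scale2])     # forward(x) = (2 * x)**2

print(as_written.forward(3.))  # 2 * 3**2   = 18
print(swapped.forward(3.))     # (2 * 3)**2 = 36

The sketch only shows the composition order; whether 2·X^2 or (2·X)^2 is the transform you actually want for your target distribution is a separate question.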
