
Exporting TensorFlow Probability's Hidden Markov Model

I want to export my HMM model because training it every time takes too long. My current approach is to save all the matrices to a file. Is there a TensorFlow way to do this? Also, is it possible to export it with an API to other languages such as C++?

You can iterate over and save the weights of the model by reading the `variables` attribute of `tfp.distributions.HiddenMarkovModel()`.

`tf.saved_model` would be the recommended way to do this. Something like:

import tensorflow as tf
import tensorflow_probability as tfp

# Build an HMM whose parameters are trainable tf.Variables.
hmm = tfp.distributions.HiddenMarkovModel(
    initial_distribution=tfp.distributions.Categorical(
        logits=tf.Variable([0., 0.])),
    transition_distribution=tfp.distributions.Categorical(
        logits=tf.Variable([[0., 0.]] * 2)),
    observation_distribution=tfp.distributions.Normal(
        tf.Variable([0., 0.]),
        tfp.util.TransformedVariable([1., 1.], tfp.bijectors.Softplus(low=1e-3))),
    num_steps=10)

x = hmm.sample(100)

opt = tf.optimizers.Adam(0.01)

@tf.function
def one_step():
  # Minimize the negative log-likelihood of the sampled data.
  with tf.GradientTape() as t:
    nll = -hmm.log_prob(x)
  grads = t.gradient(nll, hmm.trainable_variables)
  opt.apply_gradients(zip(grads, hmm.trainable_variables))

for _ in range(10):
  one_step()

# Wrap the trained model in a tf.Module so its variables are tracked
# and log_prob is exported as a concrete function.
class Foo(tf.Module):
  def __init__(self, hmm):
    super().__init__()
    self._hmm = hmm

  @tf.function(input_signature=[tf.TensorSpec.from_tensor(x)])
  def log_prob(self, x):
    return self._hmm.log_prob(x)

tf.saved_model.save(Foo(hmm), '/tmp/tf.model')
q = tf.saved_model.load('/tmp/tf.model')
q.log_prob(x)

Since the result is a standard SavedModel, it can also be loaded from other language bindings such as the TensorFlow C/C++ API, which covers the C++ part of the question.
