Training RNN on GPU - which tf.keras layer should I use?
I am training RNNs, which I built using tf.keras.layers.GRU layers. They are taking a long time to train (>2 hours), so I am going to deploy them to the GPU for training. I am wondering a few things about training on GPU:
What is the difference between tf.keras.layers.CuDNNGRU and tf.keras.layers.GRU (and also tf.keras.layers.LSTM vs. tf.keras.layers.CuDNNLSTM)? I understand from this post that CuDNNGRU layers train faster than GRU layers, but do the two layers converge to different results with the same seed, and do they perform the same during inference? I am using coremlconverter to convert my keras model to CoreML for deployment.

Is there a CuDNN equivalent of tf.keras.layers.SimpleRNN (i.e. tf.keras.layers.CuDNNSimpleRNN)? I am not committed to a specific architecture yet, so I believe I would need the tf.keras.layers.CuDNNSimpleRNN layer if I decide on SimpleRNNs and the CuDNN layer has some functionality that I need.

With CuDNN layers, do I need to have tensorflow-gpu installed? Or do they still get deployed to the GPU as long as I have the relevant drivers installed?

If you are using a CUDA-compatible GPU, it makes absolute sense to use CuDNN layers. They have a different implementation that tries to overcome computation-parallelization issues inherent in the RNN architecture. They usually perform a bit worse, but are 3x-6x faster: https://twitter.com/fchollet/status/918170264608817152?lang=en
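For concreteness, here is a minimal sketch of picking the layer based on whether a CUDA GPU is visible. It assumes the TF 1.x tf.keras API, where the fused kernel is exposed as tf.keras.layers.CuDNNGRU; the helper names `gru_layer` and `_has_cuda_gpu` are my own, not part of any API:

```python
import tensorflow as tf

def _has_cuda_gpu():
    """Best-effort GPU check across TF versions."""
    try:
        return bool(tf.config.list_physical_devices("GPU"))  # TF 2.1+
    except AttributeError:
        return tf.test.is_gpu_available(cuda_only=True)       # TF 1.x

def gru_layer(units, **kwargs):
    # Use the fused cuDNN kernel when a CUDA GPU is available and the
    # TF 1.x CuDNNGRU layer exists; otherwise fall back to the portable
    # GRU implementation. (In TF 2.x a plain GRU dispatches to cuDNN
    # automatically when its arguments permit.)
    if _has_cuda_gpu() and hasattr(tf.keras.layers, "CuDNNGRU"):
        return tf.keras.layers.CuDNNGRU(units, **kwargs)
    return tf.keras.layers.GRU(units, **kwargs)
```

On a CPU-only machine this returns a plain GRU, so the same model-building code runs in both environments.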
Do the 2 layers converge to different results with the same seed?
Yes.

Do the 2 layers perform the same during inference?
You should have comparable performance, but not exactly the same.

Do CuDNN layers require a GPU during inference?
Yes, but you can convert to a CuDNN-compatible GRU/LSTM.

Can GRU layers run inference on a GPU?
Yes.

With CuDNN layers, do I need to have tensorflow-gpu installed? Or do they still get deployed to the GPU as long as I have the relevant drivers installed?
Yes, and you need a CUDA-compatible GPU.
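The "convert to a CuDNN-compatible GRU/LSTM" step can be sketched as follows: a plain GRU with `reset_after=True` and a sigmoid recurrent activation uses the same weight layout as CuDNNGRU, so weights trained on the GPU can be loaded into a CPU-deployable twin. The checkpoint file name below is hypothetical:

```python
import numpy as np
import tensorflow as tf

# CPU-deployable twin of a CuDNNGRU model: a plain GRU configured to be
# weight-compatible with the cuDNN kernel (reset_after=True, sigmoid
# recurrent activation). The shapes (timesteps=None, features=10) are
# illustrative only.
cpu_model = tf.keras.Sequential([
    tf.keras.Input(shape=(None, 10)),
    tf.keras.layers.GRU(
        64,
        activation="tanh",
        recurrent_activation="sigmoid",
        reset_after=True,
    ),
    tf.keras.layers.Dense(1),
])

# Hypothetical checkpoint saved from the CuDNNGRU model trained on GPU:
# cpu_model.load_weights("gru_trained_on_gpu.h5")

out = cpu_model.predict(np.zeros((1, 5, 10), dtype="float32"))
print(out.shape)  # (1, 1)
```

After loading the weights, this model runs inference on CPU and can be fed to coremlconverter like any other Keras model.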