
BERT pre-training from scratch with TensorFlow 2.x

I previously used the run_pretraining.py script (https://github.com/google-research/bert/blob/master/run_pretraining.py) with TensorFlow version 1.15.5, running on a Google Cloud TPU. Is there a Python script for BERT pre-training from scratch on a TPU using TensorFlow version 2.x?

Yes, you can use the NLP library from the TF2 Model Garden.
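
In TF2 the library exposes the encoder and the pretraining heads as Keras-style building blocks. A minimal sketch, assuming `tf-models-official` is installed and importable as `tensorflow_models` (the module paths and config fields follow the garden's `official.nlp` packages; verify them against your pinned release):

```python
# pip install tf-models-official
import tensorflow_models as tfm

# Build a BERT-base-sized encoder. EncoderConfig/build_encoder come from the
# Model Garden NLP library (tfm.nlp.encoders); the field names below follow
# its BertEncoderConfig and may shift between releases.
encoder_config = tfm.nlp.encoders.EncoderConfig({
    "type": "bert",
    "bert": {
        "vocab_size": 30522,        # WordPiece vocab of the original BERT release
        "num_layers": 12,
        "hidden_size": 768,
        "num_attention_heads": 12,
    },
})
encoder = tfm.nlp.encoders.build_encoder(encoder_config)

# Wrap the encoder with masked-LM and next-sentence-prediction heads.
pretrainer = tfm.nlp.models.BertPretrainer(
    network=encoder,
    num_classes=2,              # next-sentence prediction is binary
    num_token_predictions=20,   # max masked positions per sequence
)
```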

The instructions for creating training data and running pretraining are here: nlp/docs/train.md#pre-train-a-bert-from-scratch.
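
In outline, that doc has two steps: first convert raw text into masked-LM TFRecords with the garden's create_pretraining_data.py (its flags mirror the TF1 script you used), then launch official/nlp/train.py with `--experiment=bert/pretraining`. The same registered experiment can also be loaded and overridden from Python; a sketch, under the assumption that the experiment name and config field names match the current Model Garden (the bucket path is a placeholder):

```python
from official.core import exp_factory

# 'bert/pretraining' is the experiment name that train.md passes to
# official/nlp/train.py via --experiment.
config = exp_factory.get_exp_config("bert/pretraining")

# Point the task at the TFRecords written by create_pretraining_data.py.
# Field names follow the garden's BertPretrainDataConfig.
config.task.train_data.input_path = "gs://your-bucket/pretrain/*.tfrecord"  # placeholder
config.task.train_data.global_batch_size = 512
config.task.train_data.seq_length = 128
config.task.train_data.max_predictions_per_seq = 20

config.trainer.train_steps = 1_000_000
config.runtime.distribution_strategy = "tpu"
```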

You can also follow the BERT Fine Tuning with Cloud TPU tutorial, with some changes to run the pretraining script instead of fine-tuning.
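
Compared with the fine-tuning tutorial, the change is essentially: connect to the TPU the same way, but run the pretraining experiment/task instead of the classifier fine-tuning one. A sketch continuing from the config above (`task_factory`/`train_lib` are Model Garden internals; the TPU name and model_dir are placeholders):

```python
import tensorflow as tf
from official.core import task_factory, train_lib

# Standard TF2 Cloud TPU setup (same as in the fine-tuning tutorial).
resolver = tf.distribute.cluster_resolver.TPUClusterResolver(tpu="your-tpu-name")
tf.config.experimental_connect_to_cluster(resolver)
tf.tpu.experimental.initialize_tpu_system(resolver)
strategy = tf.distribute.TPUStrategy(resolver)

model_dir = "gs://your-bucket/bert_pretrain_model"  # placeholder output dir

# Build the pretraining task from the config and train under the TPU
# strategy, mirroring what official/nlp/train.py does internally.
with strategy.scope():
    task = task_factory.get_task(config.task, logging_dir=model_dir)

train_lib.run_experiment(
    distribution_strategy=strategy,
    task=task,
    mode="train",
    params=config,
    model_dir=model_dir,
)
```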
