
Python wrapper around fasttext train with parameter tuning

I use fastText to classify toxic comments (the Kaggle competition). To train my model I run the command

fasttext supervised -input model_train.train -output model_tune -autotune-validation model_train.valid -autotune-modelsize 100M -autotune-duration 1200

which trains a classification model and tunes parameters while ensuring the size of the model stays below 100M. Is there a Python wrapper to train a supervised model with -autotune-validation? I know there are Python wrappers for the predict and train methods, but I couldn't find one that trains classification models with autotune-validation. Also, if on top of that there is an sklearn wrapper that does the same thing, that would be marvelous.

Thanks in advance

Yes, you can autotune it using Python by adding the autotuneValidationFile parameter to the function.

Ref: https://fasttext.cc/docs/en/autotune.html

As explained here, the Python wrapper for fastText's automatic hyperparameter optimization has the following syntax:

model_tune = fasttext.train_supervised(input='model_train.train',
    autotuneValidationFile='model_train.valid',
    autotuneModelSize='100M',
    autotuneDuration=1200)
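As for the sklearn part of the question: there is no official sklearn wrapper, but a minimal estimator-style class is easy to sketch. The class name FastTextClassifier below is hypothetical (not part of the fasttext package); it just stores the autotune parameters in __init__ following sklearn conventions and forwards them to fasttext.train_supervised in fit(). Note that fasttext reads labelled text from a file, so fit() takes a file path rather than an (X, y) array pair.

```python
class FastTextClassifier:
    """Sketch of an sklearn-style wrapper around fasttext.train_supervised.

    Hypothetical class, not an official API. The fasttext package is only
    needed when fit() is called (lazy import), so the wrapper itself can be
    loaded and configured without it.
    """

    def __init__(self, autotune_validation_file=None,
                 autotune_model_size='100M', autotune_duration=1200):
        # sklearn convention: __init__ only stores parameters, no work here
        self.autotune_validation_file = autotune_validation_file
        self.autotune_model_size = autotune_model_size
        self.autotune_duration = autotune_duration
        self.model_ = None  # set by fit(), trailing underscore per sklearn

    def get_params(self, deep=True):
        # sklearn-compatible parameter access (used by GridSearchCV etc.)
        return {
            'autotune_validation_file': self.autotune_validation_file,
            'autotune_model_size': self.autotune_model_size,
            'autotune_duration': self.autotune_duration,
        }

    def fit(self, train_file, y=None):
        # fasttext expects a file in "__label__X text..." format, so the
        # first argument is a path; y is ignored (kept for API symmetry).
        import fasttext  # imported lazily so the wrapper loads without it
        self.model_ = fasttext.train_supervised(
            input=train_file,
            autotuneValidationFile=self.autotune_validation_file,
            autotuneModelSize=self.autotune_model_size,
            autotuneDuration=self.autotune_duration,
        )
        return self

    def predict(self, texts):
        # return the top predicted label for each input string
        return [self.model_.predict(t)[0][0] for t in texts]
```

Usage would then mirror the command line above: FastTextClassifier(autotune_validation_file='model_train.valid').fit('model_train.train'). This is only a starting point; a full sklearn estimator would also implement set_params and score.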

