
Meaning of "drop" in SpaCy custom NER model training?

The code below is an example training loop for SpaCy's named entity recognition (NER):

import random
from spacy.gold import GoldParse  # spaCy v2.x API

# nlp, train_data and optimizer are assumed to be created earlier,
# e.g. with spacy.blank(...) and nlp.begin_training()
for itn in range(100):
    random.shuffle(train_data)
    for raw_text, entity_offsets in train_data:
        doc = nlp.make_doc(raw_text)
        gold = GoldParse(doc, entities=entity_offsets)
        nlp.update([doc], [gold], drop=0.5, sgd=optimizer)
nlp.to_disk("/model")

According to spaCy, drop is the dropout rate. Can somebody explain the meaning of this in detail?

According to the documentation here, the SpaCy Entity Recognizer is a neural network that should implement the thinc.neural.Model API. The drop argument that you are asking about is the dropout rate, a regularization technique in which a random fraction of the network's activations is disabled during training to reduce overfitting.
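To make the idea concrete, here is a minimal sketch of dropout in plain NumPy. It is not part of spaCy's API, and the function name apply_dropout is only for illustration; it just shows what "dropping" a fraction of values means:

import numpy as np

def apply_dropout(activations, drop=0.5, training=True):
    # Inverted dropout: zero out a fraction `drop` of the values and
    # rescale the survivors so the expected magnitude stays the same.
    if not training or drop == 0.0:
        return activations
    mask = np.random.rand(*activations.shape) >= drop  # keep with probability 1 - drop
    return activations * mask / (1.0 - drop)

# With drop=0.5, roughly half of the values are zeroed on each call
print(apply_dropout(np.ones(10), drop=0.5))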

Based on my experience, the recommended value is 0.2, which means that about 20% of the neurons used in this model are dropped randomly on each training update.
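Applied to the loop from the question, that simply means lowering the drop argument passed to nlp.update. A sketch assuming the same spaCy v2 setup, with nlp, train_data and optimizer defined as in the question:

import random
from spacy.gold import GoldParse

for itn in range(100):
    random.shuffle(train_data)
    for raw_text, entity_offsets in train_data:
        doc = nlp.make_doc(raw_text)
        gold = GoldParse(doc, entities=entity_offsets)
        # drop=0.2: roughly 20% of activations are zeroed on each update
        nlp.update([doc], [gold], drop=0.2, sgd=optimizer)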
