
Train a model with a task and test it with another task?

I have a data frame consisting of 3000 samples, n features, and two target columns, as follows:

mydata:
       id,   f1, f2, ..., fn, target1, target2
       01,   23, 32, ..., 44,   0    ,  1
       02,   10, 52, ..., 11,   1    ,  2
       03,   66, 15, ..., 65,   1    ,  0
                     ...
       2000, 76, 32, ..., 17,   0    ,  1

Here, I have a multi-task learning problem (I am quite new to this domain), and I want to train a model/network with target1 and test it with target2.

If we consider target1 and target2 as tasks, they might be related, but we do not know how much. So, I want to see to what extent a model trained on task1 (target1) can be used to predict task2 (target2).

It seems this is not possible, since target1 is a binary class (0 and 1) but target2 has more than two values (0, 1, and 2). Is there any way to handle this issue?

This is not called multi-task learning but transfer learning. It would be multi-task learning if you had trained your model to predict both target1 and target2.

Yes, there are ways to handle this issue. The final layer of the model is just the classifier head that computes the final label from the previous layer's output. You can treat the output of that previous layer as an embedding of the data point and use this representation to train/fine-tune another model. You have to plug in another head, though, since you now have three classes.

So in pseudo-code, you need something like:

model = remove_last_layer(model)   # keep the trained body as a feature extractor
model.add(<new classification head outputting 3 classes>)
model.train()                      # train the new head (or fine-tune everything) on target2

You can then compare this approach to a baseline trained from scratch on target2, to analyze how much transfers between the two tasks.
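The pseudo-code above can be made concrete with a minimal, framework-free NumPy sketch. Everything here is an illustrative assumption, not from the question: the feature count, the hidden size, and the synthetic, correlated target1/target2 labels that stand in for `mydata`. Stage 1 trains a small one-hidden-layer net on the binary target1; stage 2 freezes the body, uses its hidden activations as embeddings (the `remove_last_layer` step), and trains a new 3-class head on target2; finally the same architecture is trained from scratch on target2 as the baseline.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical stand-in for `mydata`: 3000 samples, 10 features.
n, d, h = 3000, 10, 16
X = rng.normal(size=(n, d))
w_true = rng.normal(size=d)
score = X @ w_true
y1 = (score > 0).astype(int)             # binary target1 (0/1)
y2 = np.digitize(score, [-0.8, 0.8])     # related 3-class target2 (0/1/2)

def softmax(z):
    z = z - z.max(axis=1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=1, keepdims=True)

def one_hot(y, k):
    return np.eye(k)[y]

def train_head(H, y, k, steps=300, lr=0.5, seed=1):
    """Train only a linear softmax head on fixed features H."""
    r = np.random.default_rng(seed)
    W = r.normal(scale=0.1, size=(H.shape[1], k))
    b = np.zeros(k)
    for _ in range(steps):
        G = (softmax(H @ W + b) - one_hot(y, k)) / len(y)
        W -= lr * H.T @ G
        b -= lr * G.sum(0)
    return W, b

def train_net(X, y, k, steps=300, lr=0.5, seed=2):
    """Train a one-hidden-layer net end to end; return body and head."""
    r = np.random.default_rng(seed)
    W1 = r.normal(scale=0.1, size=(X.shape[1], h)); b1 = np.zeros(h)
    W2 = r.normal(scale=0.1, size=(h, k));          b2 = np.zeros(k)
    for _ in range(steps):
        H = np.tanh(X @ W1 + b1)
        G = (softmax(H @ W2 + b2) - one_hot(y, k)) / len(y)
        GH = (G @ W2.T) * (1 - H**2)     # backprop through tanh
        W2 -= lr * H.T @ G;  b2 -= lr * G.sum(0)
        W1 -= lr * X.T @ GH; b1 -= lr * GH.sum(0)
    return W1, b1, W2, b2

# Stage 1: train the whole net on target1 (2 classes).
W1, b1, _, _ = train_net(X, y1, 2)

# Stage 2: freeze the body, use its output as embeddings, and attach
# a new 3-class head for target2 (= remove_last_layer + new head).
H = np.tanh(X @ W1 + b1)
W3, b3 = train_head(H, y2, 3)
pred2 = softmax(H @ W3 + b3).argmax(axis=1)
acc = (pred2 == y2).mean()

# Baseline: the same architecture trained from scratch on target2.
V1, c1, V2, c2 = train_net(X, y2, 3)
Hs = np.tanh(X @ V1 + c1)
acc_scratch = (softmax(Hs @ V2 + c2).argmax(axis=1) == y2).mean()

print(f"transfer: {acc:.3f}  from scratch: {acc_scratch:.3f}")
```

The gap between the two printed accuracies is one simple measure of how much target1 transfers to target2; in a real framework (Keras, PyTorch) the same idea is expressed by freezing the pretrained layers and replacing the output layer.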
