

Can I define a method as an attribute?

The title above is a bit ambiguous; the explanation is below:

import torch
from collections import defaultdict

# models, callbacks and global_params are project-local modules assumed to be
# importable alongside this snippet.
class Trainer:
    """Object used to facilitate training."""

    def __init__(
        self,
        # params: Namespace,
        params,
        model,
        device=torch.device("cpu"),
        optimizer=None,
        scheduler=None,
        wandb_run=None,
        early_stopping: callbacks.EarlyStopping = None,
    ):
        # Set params
        self.params = params
        self.model = model
        self.device = device

        # self.optimizer = optimizer
        self.optimizer = self.get_optimizer()
        self.scheduler = scheduler
        self.wandb_run = wandb_run
        self.early_stopping = early_stopping

        # list to contain various train metrics
        # TODO: how to add more metrics? wandb log too. Maybe save to model artifacts?

        self.history = defaultdict(list)

    @staticmethod
    def get_optimizer(
        model: models.CustomNeuralNet,
        optimizer_params: global_params.OptimizerParams,
    ):
        """Get the optimizer for the model.

        Args:
            model (models.CustomNeuralNet): [description]
            optimizer_params (global_params.OptimizerParams): [description]

        Returns:
            [type]: [description]
        """
        return getattr(torch.optim, optimizer_params.optimizer_name)(
            model.parameters(), **optimizer_params.optimizer_params
        )
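
As a side note, the getattr(torch.optim, ...) call above is a name-based dispatch: it looks up an optimizer class on the torch.optim module by its string name and then instantiates it. A minimal, self-contained sketch of that pattern follows; the SimpleOptimizerParams dataclass and the toy linear model are made up here purely for illustration, standing in for global_params.OptimizerParams and models.CustomNeuralNet.

import torch
import torch.nn as nn
from dataclasses import dataclass, field


@dataclass
class SimpleOptimizerParams:
    # Hypothetical stand-in for global_params.OptimizerParams.
    optimizer_name: str = "AdamW"
    optimizer_params: dict = field(default_factory=lambda: {"lr": 1e-3})


model = nn.Linear(10, 2)            # toy stand-in for models.CustomNeuralNet
opt_params = SimpleOptimizerParams()

# getattr resolves the class torch.optim.AdamW from its name, and the class
# is then called with the model parameters and the keyword arguments.
optimizer = getattr(torch.optim, opt_params.optimizer_name)(
    model.parameters(), **opt_params.optimizer_params
)
print(type(optimizer).__name__)     # "AdamW"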

Notice that initially I passed optimizer in through the constructor, meaning I would construct it outside this class. However, I have now put get_optimizer inside the class itself (for consistency, though I am unsure whether that is okay). So, should I still define self.optimizer = self.get_optimizer(), or just call self.get_optimizer() at the designated places in the class? The former is more readable to me.
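
To make the two options concrete, here is a stripped-down sketch. TrainerA, TrainerB, train_step and the plain dict of optimizer settings are simplified placeholders, not the actual code above: option 1 builds the optimizer once in __init__ and stores it, option 2 rebuilds it at every call site.

import torch
import torch.nn as nn


class TrainerA:
    """Option 1: build the optimizer once and keep it as an attribute."""

    def __init__(self, model: nn.Module):
        self.model = model
        self.optimizer = self.get_optimizer(self.model, {"lr": 1e-3, "momentum": 0.9})

    @staticmethod
    def get_optimizer(model: nn.Module, opt_params: dict) -> torch.optim.Optimizer:
        return torch.optim.SGD(model.parameters(), **opt_params)

    def train_step(self):
        # Reuses the single optimizer instance created in __init__, so state
        # such as momentum buffers persists across calls.
        self.optimizer.step()


class TrainerB:
    """Option 2: call get_optimizer at each place an optimizer is needed."""

    def __init__(self, model: nn.Module):
        self.model = model

    @staticmethod
    def get_optimizer(model: nn.Module, opt_params: dict) -> torch.optim.Optimizer:
        return torch.optim.SGD(model.parameters(), **opt_params)

    def train_step(self):
        # Creates a brand-new optimizer on every call; any internal state
        # accumulated by a previous optimizer is discarded.
        optimizer = self.get_optimizer(self.model, {"lr": 1e-3, "momentum": 0.9})
        optimizer.step()

If the optimizer's internal state matters across steps (it does for momentum-based optimizers), option 1 is the safer default; option 2 only makes sense when a fresh optimizer per call is actually what you want.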


Addendum: I now create the optimizer inside the .fit() method, which I will call, say, 5 times to train the model 5 times. In this scenario, even though there is no obvious issue because we use the optimizer once per call, would it still be better not to define self.optimizer here?

    def fit(
        self,
        train_loader: torch.utils.data.DataLoader,
        valid_loader: torch.utils.data.DataLoader,
        fold: int = None,
    ):
        """[summary]

        Args:
            train_loader (torch.utils.data.DataLoader): [description]
            valid_loader (torch.utils.data.DataLoader): [description]
            fold (int, optional): [description]. Defaults to None.

        Returns:
            [type]: [description]
        """
        self.optimizer = self.get_optimizer(
            model=self.model, optimizer_params=OPTIMIZER_PARAMS
        )
        self.scheduler = self.get_scheduler(
            optimizer=self.optimizer, scheduler_params=SCHEDULER_PARAMS
        )

There is a difference between the two: calling your get_optimizer will instantiate a new torch.optim.<optimizer> every time. In contrast, setting self.optimizer once and accessing it many times later creates only a single optimizer instance.
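
A quick way to see this, using a toy model and a hypothetical make_optimizer helper standing in for get_optimizer:

import torch
import torch.nn as nn


def make_optimizer(model: nn.Module) -> torch.optim.Optimizer:
    # Hypothetical stand-in for Trainer.get_optimizer.
    return torch.optim.Adam(model.parameters(), lr=1e-3)


model = nn.Linear(4, 1)

# Calling the factory twice gives two distinct optimizer objects, each
# starting with its own empty state (e.g. Adam's moment estimates).
opt_a = make_optimizer(model)
opt_b = make_optimizer(model)
print(opt_a is opt_b)    # False

# Storing the result once (as self.optimizer would) and reusing that
# reference keeps a single instance, so state built up by step() persists.
stored = make_optimizer(model)
reused = stored
print(stored is reused)  # True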
