
Can I tell a machine learning model that the dependent variable is normally distributed?

I am trying to set up a machine learning model predicting a continuous variable y on the basis of a feature vector (x1, x2, ..., xn). I know from elsewhere that y follows a normal distribution. Can I somehow specify this to the model and enhance its predictions this way? Is there a specific model that allows me to do this?

I have used linear models, k-nearest neighbour models, and random forest models (in Python). All of them produce reasonable predictions, but I was wondering whether they could be outperformed by a model that knows the distribution of the predicted variable.
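For context, a minimal baseline along the lines the question describes might look like the sketch below; the toy data, hyperparameters, and cross-validation setup are illustrative assumptions, not the asker's actual code.

```python
# Baseline sketch: the three model families named in the question,
# compared by cross-validated mean squared error on toy data.
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.neighbors import KNeighborsRegressor
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
X = rng.normal(size=(500, 3))                               # toy features
y = X @ np.array([1.5, -2.0, 0.5]) + rng.normal(size=500)   # y is roughly normal

for model in (LinearRegression(),
              KNeighborsRegressor(n_neighbors=10),
              RandomForestRegressor(n_estimators=200, random_state=0)):
    scores = cross_val_score(model, X, y, cv=5,
                             scoring="neg_mean_squared_error")
    print(type(model).__name__, -scores.mean())
```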

One way to do this is to create a custom objective function that penalizes predictions that are not normally distributed.
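A minimal sketch of that idea, assuming a linear model and a moment-matching penalty (a normal distribution has skewness 0 and excess kurtosis 0); the penalty weight `lam` and the toy data are illustrative choices, not part of the original answer:

```python
# Fit a linear model where the objective is MSE plus a penalty on how far
# the prediction distribution deviates from normality, measured by the
# squared sample skewness and squared excess kurtosis of the predictions.
import numpy as np
from scipy.stats import skew, kurtosis
from scipy.optimize import minimize

rng = np.random.default_rng(0)
X = rng.normal(size=(500, 3))                               # toy features
y = X @ np.array([1.5, -2.0, 0.5]) + rng.normal(size=500)   # y is roughly normal

def loss(w, lam=1.0):
    pred = X @ w[:-1] + w[-1]       # linear model with intercept as last weight
    mse = np.mean((y - pred) ** 2)
    # Moment-matching penalty: zero when predictions look normal.
    # scipy's kurtosis() returns *excess* kurtosis by default.
    penalty = skew(pred) ** 2 + kurtosis(pred) ** 2
    return mse + lam * penalty

w0 = np.zeros(X.shape[1] + 1)
res = minimize(loss, w0)            # BFGS with numerical gradients by default
print("fitted weights:", res.x)
```

Note that this penalty only shapes the marginal distribution of the predictions, not the conditional distribution of y given x, so `lam` should be tuned against held-out error rather than set blindly.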
