
Logistic regression "probability" function (is not a valid pdf...)

The idea behind logistic regression is to estimate the posterior class probability of class C_k given an observation x, using the sigmoid f(C_k | x) = 1/(1 + exp(-w·x)), where the weight vector w is learned from the data.

In every book I've read (e.g., Bishop's PRML), f(C_k | x) is treated as a probability, but it is definitely not a valid pdf: its integral over x from minus infinity to infinity does not equal 1, nor could any normalization fix that, since the integral diverges.
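The asker's observation is easy to check numerically: as a function of x, the sigmoid approaches 1 for large x, so its integral over [-a, a] grows roughly like a and never converges. A minimal sketch (the choice w = 1 and the interval widths are arbitrary illustrative values):

```python
import math

def sigmoid(z):
    """Numerically stable logistic function, avoids overflow for large |z|."""
    if z >= 0:
        return 1.0 / (1.0 + math.exp(-z))
    ez = math.exp(z)
    return ez / (1.0 + ez)

def integral(a, n=200_000):
    """Trapezoidal approximation of the integral of sigmoid over [-a, a]."""
    h = 2 * a / n
    total = 0.5 * (sigmoid(-a) + sigmoid(a))
    total += sum(sigmoid(-a + i * h) for i in range(1, n))
    return total * h

# sigmoid(x) -> 1 as x -> +inf, so the area over [-a, a] is roughly a
# for large a: the integral grows without bound instead of converging to 1.
for a in (10, 100, 1000):
    print(a, integral(a))
```

This confirms that the sigmoid, read as a density over x, is not normalizable; the resolution (below) is that it was never meant to be a density over x in the first place.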

I'd appreciate any explanation of this.

You got it wrong: there is no integral from -inf to +inf. p(c_k | x) is a discrete distribution over the classes, and in logistic regression there are two classes, c = 1 and c = 0. The model outputs the probability of belonging to class c = 1. Subtracting p(c = 1 | x) from 1 gives the probability of the other class: p(c = 0 | x) = 1 - p(c = 1 | x). Softmax regression extends this to more than two classes by applying the softmax instead of the sigmoid (logistic) function.
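The answer's point can be sketched in a few lines: for any fixed x, the two class probabilities sum to 1 (a valid discrete distribution over classes), and softmax does the same for K classes. The weights and scores below are arbitrary illustrative values, not a trained model:

```python
import math

def sigmoid(z):
    """Logistic function: maps a real score to (0, 1)."""
    return 1.0 / (1.0 + math.exp(-z))

# Hypothetical weights w and observation x (illustrative values only).
w = [0.8, -0.5]
x = [1.0, 2.0]
score = sum(wi * xi for wi, xi in zip(w, x))  # w · x

p_c1 = sigmoid(score)   # p(c = 1 | x)
p_c0 = 1.0 - p_c1       # p(c = 0 | x)

# For every x, the probabilities over the two classes sum to 1:
# it is a distribution over classes, not a density over x.
assert abs(p_c1 + p_c0 - 1.0) < 1e-12

def softmax(scores):
    """K-class generalization: exponentiate and normalize the scores."""
    m = max(scores)  # subtract the max for numerical stability
    exps = [math.exp(s - m) for s in scores]
    total = sum(exps)
    return [e / total for e in exps]

probs = softmax([2.0, 1.0, -0.5])   # three hypothetical class scores
assert abs(sum(probs) - 1.0) < 1e-12
```

The normalization happens over the K class outcomes for a fixed x, which is why integrating the sigmoid over x is the wrong operation to check.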
