Python: Apply .apply() with a self-defined function to a DataFrame - why doesn't it work?
I am trying to apply a self-defined function to a data frame using apply(). The goal is to calculate the mean of each row / column with a self-defined function, but it doesn't work; probably I still don't fully understand the logic of .apply(). Can someone help me? Thanks in advance:
d = pd.DataFrame({"A":[50,60,70],"B":[80,90,100]})
def m(x):
    x.sum()/len(x)
    return x
d.apply(m(),axis=0)
If possible, the best way is a vectorized solution:
df = d.sum() / len(d)
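For the sample frame above this gives the column means (the same result as d.mean()). A minimal sketch showing both axes, with illustrative variable names:

import pandas as pd

d = pd.DataFrame({"A": [50, 60, 70], "B": [80, 90, 100]})

# Column means via the vectorized expression
col_means = d.sum() / len(d)                 # A: 60.0, B: 90.0

# Row means: sum along axis=1 and divide by the number of columns
row_means = d.sum(axis=1) / len(d.columns)   # 65.0, 75.0, 85.0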
Your solution is possible too, but you need to change the function so that it returns the computed value; in apply, remove the () (pass the function object itself, not the result of calling it); and finally, axis=0 is the default value for that parameter, so it can also be removed:
def m(x):
    # with the default axis=0, apply passes each column to m as a Series
    return x.sum()/len(x)
df = d.apply(m)
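For completeness, a small sketch of how the corrected function behaves on the sample data; row means follow from passing axis=1:

import pandas as pd

d = pd.DataFrame({"A": [50, 60, 70], "B": [80, 90, 100]})

def m(x):
    return x.sum()/len(x)

print(d.apply(m))           # column means: A 60.0, B 90.0
print(d.apply(m, axis=1))   # row means: 65.0, 75.0, 85.0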