
Calculate moments (mean, variance) of a distribution in Python

I have two arrays: x is the independent variable, and counts is the number of times each value of x occurs, like a histogram. I know I can calculate the mean by defining a function:

import numpy as np

def mean(x, counts):
    return np.sum(x * counts) / np.sum(counts)

Is there a general function I can use to calculate each moment from the distribution defined by x and counts? I would also like to compute the variance.
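For reference, here is a made-up example of the kind of data I mean, with the mean computed as above (the values are only illustrative):

import numpy as np

# made-up histogram-like data: values and how often each occurs
x = np.array([1.0, 2.0, 3.0, 4.0])
counts = np.array([2, 5, 8, 3])

print(np.sum(x * counts) / np.sum(counts))  # weighted mean, 48/18 ≈ 2.67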

You could use the moment function from scipy.stats. It calculates the n-th central moment of a data sample.

You could also define your own function, which could look something like this:

def nmoment(x, counts, c, n):
    # n-th moment of the distribution about the point c, weighted by counts
    return np.sum(counts * (x - c)**n) / np.sum(counts)

In that function, c is the point about which the moment is taken and n is the order. So to get the variance (the second central moment) you could call nmoment(x, counts, np.average(x, weights=counts), 2).
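A minimal usage sketch, assuming some made-up x and counts arrays (the values are only illustrative):

import numpy as np

def nmoment(x, counts, c, n):
    return np.sum(counts * (x - c)**n) / np.sum(counts)

# illustrative data: values and how often each occurs
x = np.array([1.0, 2.0, 3.0, 4.0])
counts = np.array([2, 5, 8, 3])

mu = np.average(x, weights=counts)   # weighted mean (first raw moment)
var = nmoment(x, counts, mu, 2)      # second central moment = variance
print(mu, var)

For higher moments you would just change n, e.g. nmoment(x, counts, mu, 3) for the third central moment.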

import numpy as np
from scipy import stats

# stats.moment works on a raw sample, so expand the histogram first
data = np.repeat(x, counts)   # assumes counts holds non-negative integers
stats.moment(data, moment=2)  # variance (second central moment)

stats.moment returns the n-th central moment of the sample passed to it; it does not take a weights argument, which is why the histogram is expanded with np.repeat above.
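As a quick sanity check with illustrative x and counts values, the expanded-sample route and the weighted formula from the other answer should agree:

import numpy as np
from scipy import stats

x = np.array([1.0, 2.0, 3.0, 4.0])
counts = np.array([2, 5, 8, 3])

data = np.repeat(x, counts)          # one entry per observation
mu = np.average(x, weights=counts)   # weighted mean

# both lines compute the second central moment (the variance)
print(stats.moment(data, moment=2))
print(np.sum(counts * (x - mu)**2) / np.sum(counts))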

NumPy also provides built-in statistics routines (a short sketch follows the list below):

https://numpy.org/doc/stable/reference/routines.statistics.html

  • np.average
  • np.std
  • np.var, etc.
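A minimal sketch of the NumPy-only route, again with illustrative data; np.average accepts a weights argument, while (to my knowledge) np.var and np.std do not, so the weighted variance is computed here by averaging squared deviations with the same weights:

import numpy as np

x = np.array([1.0, 2.0, 3.0, 4.0])
counts = np.array([2, 5, 8, 3])

mu = np.average(x, weights=counts)              # weighted mean
var = np.average((x - mu)**2, weights=counts)   # weighted variance
std = np.sqrt(var)                              # weighted standard deviation
print(mu, var, std)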
