
How do you compute the moving average of a given dataset in C#?

I have a randomly generated dataset, and I need to find its moving average over a window whose size the user inputs. For example, the dataset is a list {1, 2, 3, 4, 5, 6, 7, 8, 9, 10} and the user inputs a sample size of 2. The program must first calculate the mean of 1 and 2 = 1.5, then 2 and 3 = 2.5, then 3 and 4 = 3.5, and so on. How do I do this? Thanks!

You can keep a running sum and queue up the values so you know what to subtract from the sum once you reach the sample size.

// Requires: using System.Collections.Generic;
public static IEnumerable<decimal> MovingAverages(IEnumerable<decimal> values, int sample)
{
    // The queue holds the current window; sum tracks the window's total.
    var queue = new Queue<decimal>(sample);
    decimal sum = 0;
    foreach (var x in values)
    {
        sum += x;
        queue.Enqueue(x);
        if (queue.Count == sample)
        {
            // The window is full: emit its average, then drop the oldest
            // value so the next element can slide in.
            yield return sum / sample;
            sum -= queue.Dequeue();
        }
    }
}

Basically this queues up the first n values (where n equals sample) while accumulating their sum. Once it has enough values to average, it yields the sum divided by the sample size, then removes the oldest value from the queue and subtracts it from the sum. Note that if the sample size is larger than the number of values, this returns an empty enumerable.
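For example, here is a quick usage sketch. The data comes from the question; the surrounding class and Main are assumptions for illustration, with MovingAverages defined in the same class:

using System;
using System.Collections.Generic;

class Program
{
    static void Main()
    {
        var data = new List<decimal> { 1, 2, 3, 4, 5, 6, 7, 8, 9, 10 };
        // With a sample size of 2, this prints 1.5, 2.5, 3.5, ..., 9.5
        foreach (var avg in MovingAverages(data, 2))
            Console.WriteLine(avg);
    }

    // MovingAverages method from above goes here.
}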

This can be done with a simple loop.

// Requires: using System.Linq;
for (int i = 0; i <= array.Length - samplesize; i++)
    Console.WriteLine(array.Skip(i).Take(samplesize).Average());

The Skip(i).Take(samplesize) portion selects only the elements you are interested in at the moment.
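Putting it together as a runnable sketch (the array contents, the samplesize value, and the Main wrapper are assumptions for illustration):

using System;
using System.Linq;

class Demo
{
    static void Main()
    {
        int[] array = { 1, 2, 3, 4, 5, 6, 7, 8, 9, 10 };
        int samplesize = 2;
        // Each iteration averages one window of samplesize elements.
        for (int i = 0; i <= array.Length - samplesize; i++)
            Console.WriteLine(array.Skip(i).Take(samplesize).Average());
        // Prints 1.5, 2.5, 3.5, ..., 9.5
    }
}

Note that this re-averages each window from scratch, so it does O(n × samplesize) work in total, whereas the queue-based answer above touches each element only a constant number of times.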
