
fastest algorithm to get average change

I have a sorted dictionary where the key is a date and the value is an integer that represents time left. I have 3 years' worth of data.

so it would be something like

Key: 2009-1-1, Value: 100
Key: 2009-1-2, Value: 97
Key: 2009-1-3, Value: 92
Key: 2009-1-4, Value: 87 ...
...
Key: 2009-1-30, Value: 0

I would like to calculate the average change per day and wanted to see if there was any elegant way of doing this.

If the values are strictly descending, then the average change per day is:

 total change = time on the last day - time on the first day
 average change = total change / (number of days - 1)

Note the divisor: n daily values contain only n - 1 day-to-day changes. The whole thing can be calculated in O(1), provided you know the size of your dictionary and can read its first and last values directly.
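
For the sample data above, that is a total change of 0 - 100 = -100 spread over 30 days, i.e. 29 day-to-day steps, so the average change is -100 / 29 ≈ -3.45 per day.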

How to do it in code...

This will handle both increasing and decreasing values:

        int changeTot = 0;   // running sum of day-to-day changes
        int lastVal = 0;
        bool first = true;

        // SortedDictionary enumerates in key (date) order,
        // so consecutive values are consecutive days.
        foreach (int val in myDict.Values)
        {
            if (!first) changeTot += val - lastVal;  // no previous day for the first value
            lastVal = val;
            first = false;
        }

        // n values contain n - 1 changes (guard for Count < 2 omitted for brevity)
        double avg = (double)changeTot / (myDict.Count - 1);

Of course this is O(n), since you only pass through the dictionary once.
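
For context, a minimal, self-contained setup for the snippet above might look like this (the sample values mirror the question; C# 6+ index-initializer syntax assumed):

    using System;
    using System.Collections.Generic;

    var myDict = new SortedDictionary<DateTime, int>
    {
        [new DateTime(2009, 1, 1)] = 100,
        [new DateTime(2009, 1, 2)] = 97,
        [new DateTime(2009, 1, 3)] = 92,
        [new DateTime(2009, 1, 4)] = 87,
    };

    // Running the loop above on these four values gives
    // changeTot = -13 and avg = -13 / 3.0 ≈ -4.33.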

If your values are only increasing or only decreasing, you can skip the loop and use a bit of LINQ:

// requires: using System.Linq;
double avg = (double)(myDict.Last().Value - myDict.First().Value) / (myDict.Count - 1);

The arithmetic here is O(1), though note that Enumerable.Last() has to walk the whole SortedDictionary, so the line as written still costs O(n); you need direct access to both ends to make it truly O(1). Also, since the sum of consecutive changes telescopes to last minus first, this shortcut actually returns the same answer as the loop above even when the values are not monotonic.
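
If true O(1) access to both ends matters, one alternative (not in the original question, so treat this as a sketch) is SortedList<TKey, TValue>, whose Keys and Values collections support O(1) access by index:

    using System;
    using System.Collections.Generic;

    var list = new SortedList<DateTime, int>(myDict);  // or build it in date order directly

    double avgChange = (double)(list.Values[list.Count - 1] - list.Values[0])
                       / (list.Count - 1);

The trade-off is that SortedList insertions in the middle are O(n), so it suits data that arrives mostly in key order.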

Are you doing this more than once? You could keep track of the average change, and every time you insert, remove, or alter an element in the dictionary, you could update the average change just by looking at the element's two neighbors.
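
A rough sketch of that idea (the ChangeTracker class and its append-only assumption are mine, not from the answer): because the running change total telescopes to last - first, appending in date order only requires remembering the first value, the latest value, and a count:

    // Hypothetical helper; assumes values are appended in date order,
    // one per day, alongside the dictionary inserts.
    class ChangeTracker
    {
        private int _first;  // value on the earliest day
        private int _last;   // value on the latest day
        private int _count;  // number of days seen

        public void Add(int value)
        {
            if (_count == 0) _first = value;
            _last = value;
            _count++;
        }

        // Average day-to-day change: (last - first) / (n - 1) by telescoping.
        public double AverageChange =>
            _count < 2 ? 0.0 : (double)(_last - _first) / (_count - 1);
    }

Each Add is O(1), and the average is available at any time without rescanning the dictionary. Handling removals or mid-series edits would need the neighbor bookkeeping described above.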
