
Caching updating time series data

I'm trying to use Redis to cache time series data for 270 stocks. Every 2 or 3 seconds I get an array of stock changes (trades) that have just happened. I want to save this data in Redis, so I'm currently trying to think of the best (and most efficient) way to do so.

First I considered having 270 lists in Redis, where each list corresponds to one of the stocks that could get updated, and on any update I add the object to the corresponding list. This has two main problems. First, let's say one of the updates contains 10 different stocks that just changed; this means I'll have to communicate with Redis 10 times. The other problem is retrieval: if I want to get the data for all the stocks, I'll have to communicate with Redis 270 times.
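For concreteness, here is a minimal sketch of that per-stock-list layout in Python with redis-py; the key scheme stock:<symbol>:updates and the helper names are my own assumptions, not anything Redis prescribes:

    import json
    import redis

    r = redis.Redis()

    # One list per stock; every trade is appended to its stock's list.
    # The key scheme "stock:<symbol>:updates" is hypothetical.
    def record_update(update):
        r.rpush(f"stock:{update['symbol']}:updates", json.dumps(update))

    # Reading everything back costs one round trip per stock (270 total).
    def read_all(symbols):
        return {
            sym: [json.loads(raw) for raw in r.lrange(f"stock:{sym}:updates", 0, -1)]
            for sym in symbols
        }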

The other approach would be to have just one hash which maps to a JSON object with 270 keys, where each value in the object is an array of that stock's updates.
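Reading that as a Redis hash with one field per stock, each field holding a JSON array, a rough sketch might look like this (the key name "stocks" and the helpers are mine, not the poster's):

    import json
    import redis

    r = redis.Redis()

    HASH_KEY = "stocks"  # hypothetical key name

    # One hash, one field per stock; each field holds a JSON array of updates.
    def record_update(update):
        field = update["symbol"]
        raw = r.hget(HASH_KEY, field)
        arr = json.loads(raw) if raw else []
        arr.append(update)
        r.hset(HASH_KEY, field, json.dumps(arr))

    # A single HGETALL fetches every stock in one round trip.
    def read_all():
        return {
            field.decode(): json.loads(raw)
            for field, raw in r.hgetall(HASH_KEY).items()
        }

Note that the HGET/HSET cycle above is not atomic; with several concurrent writers you would want WATCH/MULTI or a Lua script.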

I'm currently favoring the second approach, but I'm wondering if there's something else I can do that may be better than these two approaches.

@ninesalt You can use a Redis pipeline to send batches of commands in order to speed things up. I have used it before (in development) with queues of over 10k commands.
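As a hedged sketch of what that looks like with redis-py, reusing the hypothetical per-stock list keys from above, both the 10 writes and the 270 reads collapse into one round trip each:

    import json
    import redis

    r = redis.Redis()

    # Queue every write for one batch of trades locally, then flush
    # them all to Redis in a single round trip.
    def record_batch(updates):
        pipe = r.pipeline(transaction=False)
        for update in updates:
            pipe.rpush(f"stock:{update['symbol']}:updates", json.dumps(update))
        pipe.execute()

    # Reads batch the same way: 270 LRANGE commands, one round trip.
    def read_all(symbols):
        pipe = r.pipeline(transaction=False)
        for sym in symbols:
            pipe.lrange(f"stock:{sym}:updates", 0, -1)
        results = pipe.execute()
        return {
            sym: [json.loads(raw) for raw in raws]
            for sym, raws in zip(symbols, results)
        }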

You should also take a look at RedisTimeSeries, which can make querying much faster. It is optimized for the use case you describe and offers several aggregations which you might find helpful as well.
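A rough sketch of that approach, assuming the RedisTimeSeries module is loaded on the server and a redis-py version recent enough to expose the .ts() helper (the key scheme price:<symbol> is my own):

    import redis

    r = redis.Redis()
    ts = r.ts()  # requires the RedisTimeSeries server module

    # One series per stock, labelled so they can be queried as a group.
    def create_series(symbols):
        for sym in symbols:
            ts.create(f"price:{sym}", labels={"type": "price"})

    # TS.MADD writes samples for many stocks in a single command.
    def record_batch(updates):  # updates: [(symbol, timestamp_ms, price), ...]
        ts.madd([(f"price:{sym}", when, price) for sym, when, price in updates])

    # Server-side aggregation: average price per 10-second bucket.
    def avg_prices(symbol, start_ms, end_ms):
        return r.execute_command(
            "TS.RANGE", f"price:{symbol}", start_ms, end_ms,
            "AGGREGATION", "avg", 10_000,
        )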

It depends on the QPS (queries per second).

If it's very high, the second solution has a problem: it doesn't scale. Multiple servers will query the same key serially (because Redis is single-threaded). Maybe you can try a timer task which periodically pulls the data into a local cache, as sketched below.
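A minimal sketch of such a timer task in Python (the key name stocks:updates, the refresh interval, and the helper names are all my assumptions): each server process re-reads the shared key once per interval and serves queries from its in-process copy, so Redis sees one reader per process rather than one per query.

    import json
    import threading
    import redis

    r = redis.Redis()
    _cache = {}              # process-local snapshot
    _lock = threading.Lock()

    # Re-read the shared key every `interval` seconds and swap in the
    # new snapshot; a real version would also handle shutdown.
    def refresh(interval=1.0):
        raw = r.get("stocks:updates")  # hypothetical key from earlier
        snapshot = json.loads(raw) if raw else {}
        with _lock:
            _cache.clear()
            _cache.update(snapshot)
        threading.Timer(interval, refresh, args=(interval,)).start()

    # Request handlers read the local snapshot instead of hitting Redis.
    def get_stock(symbol):
        with _lock:
            return _cache.get(symbol, [])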
