
How much data can be cached for better performance?

I have a table in SQL Server with over a million rows. I want to create a cache and store the whole table in it. Should I store all million rows there? Is that good practice? If not, how should I proceed?

EDIT: We are not writing anything to this table. It is for read purposes only.

Contrary to the comments suggesting that SQL Server will keep the records in its buffer, I have to say that this is not a solution for improving performance. You cannot count on the SQL Server cache, because

it caches only:

  • Query plans
  • Pages from the database files

but it does NOT cache the results of a query. See this post.

Now let's discuss your questions:

Putting data into the cache needs a strategy that depends on many factors: How frequently are these 1 million records fetched? Are they fetched as one big chunk of data, or do you have queries that each fetch only a part of it? (A cache-aside sketch for the second case follows below.)
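If different queries fetch different slices of the table, a common approach is the cache-aside pattern: look in the cache first and only hit SQL Server on a miss. This is a minimal sketch, not the answerer's code; load_rows_from_db and the cache key scheme are placeholders for whatever data-access code you already use.

```python
# Minimal cache-aside sketch. load_rows_from_db() is a placeholder for the
# real parameterized query you already run against SQL Server.

_cache = {}  # in-process cache: query key -> list of rows


def load_rows_from_db(customer_id):
    # Placeholder: execute the real query here and return its rows.
    raise NotImplementedError


def get_rows(customer_id):
    key = ("rows_by_customer", customer_id)
    rows = _cache.get(key)
    if rows is None:                      # cache miss -> go to the database
        rows = load_rows_from_db(customer_id)
        _cache[key] = rows                # keep the result for later reads
    return rows
```

This caches each query result under its own key, so only the parts of the table that are actually requested end up in memory.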

Then you need a strategy for invalidation, which means answering these questions (a time-based expiry sketch follows the list):

Is it OK to cache the data for a couple of hours?

In general, is it OK to invalidate based on time, or do you need another invalidation strategy?
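If time-based invalidation is acceptable, each cached entry can carry a timestamp and be reloaded once it is older than a chosen TTL. A minimal sketch under that assumption; the two-hour TTL and the load_all_rows placeholder are illustrative, not taken from the original answer:

```python
import time

TTL_SECONDS = 2 * 60 * 60        # assumption: data may be up to 2 hours stale
_cache = {}                      # key -> (stored_at, value)


def load_all_rows():
    # Placeholder for the real SELECT against SQL Server.
    raise NotImplementedError


def get_all_rows():
    entry = _cache.get("all_rows")
    if entry is not None:
        stored_at, rows = entry
        if time.time() - stored_at < TTL_SECONDS:
            return rows          # still fresh enough, no database round trip
    rows = load_all_rows()       # expired or never loaded -> reload
    _cache["all_rows"] = (time.time(), rows)
    return rows
```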

And where does your cache live? In heap memory (in-process), or in a distributed cache? (A sketch of the distributed case follows below.)
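If several application instances need to share the cache, or the data is too large to keep on each instance's heap, a distributed cache such as Redis is a common choice. A minimal sketch using the redis-py client; the host, port, key name, TTL, and JSON serialization are assumptions for illustration only:

```python
import json

import redis  # assumes the redis-py client is installed


r = redis.Redis(host="localhost", port=6379)   # assumed connection settings


def load_all_rows():
    # Placeholder for the real query against SQL Server.
    raise NotImplementedError


def get_all_rows():
    cached = r.get("table:all_rows")
    if cached is not None:
        return json.loads(cached)              # served from the shared cache
    rows = load_all_rows()
    # SETEX stores the value with a TTL (here 2 hours) so it expires on its own.
    r.setex("table:all_rows", 2 * 60 * 60, json.dumps(rows))
    return rows
```

The trade-off is a network round trip per lookup instead of a heap access, in exchange for a single shared copy of the data and memory usage that does not grow with the number of application instances.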
