
How much data can SQL Server handle?

DB: SQL Server 2014 Standard or higher

Server: MS Azure, with no limit on CPU or RAM

We are designing our new back-end architecture. We have approximately 20 million rows in a single table, and the SQL looks like:

select * from xxxx
where (type=1 or type=2 or type=3) and someNumber<5000
order by xxxxxx

The query does not join any other tables. Can this SQL respond immediately (in 500 to 1000 ms)?

When the data grows to 100 million rows, will performance still be acceptable?

Or are there techniques to optimize it (e.g. SQL views, caching, etc.)?

SQL Azure should work well for you. I have had databases on SQL Azure with over 250 million records in a single table on a Standard 2 tier database, which is not very expensive. If you have good indexes and simple queries it will work fine. Ultimately, you should create a database and just give it a try.

You can also use table partitioning, which can help somewhat with the performance and management of such a large table.
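As a rough sketch of what that partitioning might look like, the T-SQL below splits a table on ranges of someNumber. All object names here are placeholders, and the boundary values are illustrative only; note that before SQL Server 2016 SP1, table partitioning required Enterprise edition on-premises, though Azure SQL Database supports it.

```sql
-- Sketch only: hypothetical partition function/scheme names and boundaries.
CREATE PARTITION FUNCTION pf_someNumber (int)
    AS RANGE LEFT FOR VALUES (5000, 50000, 500000);

CREATE PARTITION SCHEME ps_someNumber
    AS PARTITION pf_someNumber ALL TO ([PRIMARY]);

-- Rebuild the clustered index on the partition scheme so rows with
-- someNumber < 5000 land in their own partition and can be scanned alone.
CREATE CLUSTERED INDEX cix_xxxx_someNumber
    ON xxxx (someNumber)
    ON ps_someNumber (someNumber);
```

With this layout, the predicate someNumber < 5000 touches only the first partition instead of the whole table.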

At Stackify we manage over 1,000 databases on SQL Azure for our multi-tenant SaaS product.

Your query can be written more simply as:

select *
from xxxx
where type in (1, 2, 3) and someNumber < 5000
order by xxxxxx;

This is hard to optimize. You can try an index on xxxx(type, someNumber). That index would work better with this version of the query:

select *
from ((select * from xxxx where type = 1 and somenumber < 5000) union all
      (select * from xxxx where type = 2 and somenumber < 5000) union all
      (select * from xxxx where type = 3 and somenumber < 5000)
     ) x
order by xxxxxx;
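The index mentioned above could be created as follows; this is a sketch with hypothetical names, and the INCLUDE list (needed because the query selects *) should list the columns actually returned so the index covers the query without key lookups:

```sql
-- Hypothetical index name; replace colA, colB with the columns
-- the query actually selects.
CREATE NONCLUSTERED INDEX ix_xxxx_type_someNumber
    ON xxxx (type, someNumber)
    INCLUDE (colA, colB);
```

With this index in place, each branch of the UNION ALL query can be satisfied by a single index seek on an exact type value plus the someNumber range, rather than a scan.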
