
SQL Identity Increment and Seed

I'm using Microsoft SQL Server for my database.

My table has an ID column. In my usage, rows are deleted and re-created many times, so if I rely on SQL Server's Identity Increment and Identity Seed, the ID numbers will get very big after a while...

It is not important that every new row gets an ID bigger than the existing ones; the ID just has to be unique.

How can I do that in SQL Server? Should I disable the automatic increment and set the ID manually? If so, how?

I'm using C# and SQL Server Express.

I'm not sure why you care whether the numbers are big or small (hopefully users don't have sentimental value or place any kind of meaning on identity values), but one way to avoid exhaustion problems is to use a BIGINT. At something like 1,000 IDs per second it would take roughly 290 million years to reach the positive limit, and you can double that if you start at the negative boundary. Yes, a BIGINT is 8 bytes instead of 4, but it is still much smaller and more usable than a GUID. And if you combine this with data compression, it won't require any more storage than an INT until well after you've used 2 billion numbers.
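To make that concrete, here is a minimal sketch (table and column names are hypothetical) of a BIGINT identity column combined with row compression; note that data compression requires Enterprise edition before SQL Server 2016 SP1, so it may not be available on Express.

-- Hypothetical table: 8-byte identity key; row compression keeps small values compact
CREATE TABLE dbo.Orders
(
    ID BIGINT IDENTITY(1, 1) NOT NULL PRIMARY KEY,
    OrderDate DATETIME2 NOT NULL,
    CustomerName NVARCHAR(100) NOT NULL
)
WITH (DATA_COMPRESSION = ROW);
GO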

Don't over-engineer this, and don't make the mistake of assuming an identity value's size or magnitude should mean something. It is a surrogate value generated for internal identification and efficiency only. If you're telling users about this value, something isn't right.

You can try this - IDENTITY_INSERT lets you supply the ID value yourself:

-- Allow explicit values to be inserted into the identity column
SET IDENTITY_INSERT YourSchema.YourTable ON;
GO
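For context, a minimal sketch of how this is typically used (schema, table and column names are hypothetical): turn IDENTITY_INSERT on, insert the row with an explicit ID, then turn it back off, because only one table per session can have it enabled at a time and the column list must be spelled out.

SET IDENTITY_INSERT YourSchema.YourTable ON;

INSERT INTO YourSchema.YourTable (ID, Name)  -- explicit column list is required
VALUES (42, N'Re-created row');

SET IDENTITY_INSERT YourSchema.YourTable OFF;  -- restore automatic numbering
GO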

Unless you want to mess about a lot - i.e. only insert rows through a stored procedure that uses either SELECT MAX(ID) or a table holding the next ID, and copes with multi-user access, etc. - you are stuck with this in SQL Server 2005 (SQL Server 2012 introduces sequences, apparently).
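A sketch of the "table with the next ID in it" approach, using hypothetical names; the UPDLOCK/HOLDLOCK hints are what cope with multi-user access, by serializing concurrent callers. On SQL Server 2012 or later a SEQUENCE replaces all of this.

CREATE TABLE dbo.IdAllocator (NextId INT NOT NULL);
INSERT INTO dbo.IdAllocator (NextId) VALUES (1);
GO

CREATE PROCEDURE dbo.GetNextId
    @Id INT OUTPUT
AS
BEGIN
    SET NOCOUNT ON;
    BEGIN TRANSACTION;
        -- UPDLOCK/HOLDLOCK block other callers until we commit,
        -- so two sessions can never be handed the same value
        SELECT @Id = NextId
        FROM dbo.IdAllocator WITH (UPDLOCK, HOLDLOCK);

        UPDATE dbo.IdAllocator SET NextId = NextId + 1;
    COMMIT TRANSACTION;
END
GO

-- SQL Server 2012+ equivalent:
-- CREATE SEQUENCE dbo.IdSeq START WITH 1 INCREMENT BY 1;
-- SELECT NEXT VALUE FOR dbo.IdSeq;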

The other option is to renumber the IDs and reseed the identity, but that's a lot of work as well.
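The reseeding half is a one-liner with DBCC CHECKIDENT (table name hypothetical); renumbering the existing rows is the part that takes the real work.

-- After deleting rows, reset the identity; the next row inserted gets 0 + increment = 1
-- (an empty, never-populated table would start at the reseed value itself)
DBCC CHECKIDENT ('YourSchema.YourTable', RESEED, 0);
GO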

Besides, big numbers aren't a problem - they only take up four bytes, the same as small ones. Rolling over is an issue, of course, but it's going to take a while to get there!

You could flip over to a GUID, which will scale, but GUIDs make for a very inefficient index, so unless you are going to have more than 2^31 records it isn't worth it just for this either.
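If you did go the GUID route, a UNIQUEIDENTIFIER column defaulted to NEWSEQUENTIALID() (table name hypothetical) avoids most of the index fragmentation that random NEWID() values cause, though the key is still 16 bytes wide.

CREATE TABLE dbo.Items
(
    ID UNIQUEIDENTIFIER NOT NULL
        CONSTRAINT DF_Items_ID DEFAULT NEWSEQUENTIALID()  -- sequential, so the clustered index stays mostly in order
        CONSTRAINT PK_Items PRIMARY KEY,
    Name NVARCHAR(100) NOT NULL
);
GO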

As someone else mentioned, BIGINT would give you a greater range, but of course even bigger numbers. :(
