
How many years of millisecond timestamps can be represented by 41 bits?

I'm looking at the Instagram blog post about sharded ID generation. The post describes generating 64-bit identifiers. Their scheme allocates 41 of the 64 bits to a millisecond timestamp, and they say:

  • 41 bits for time in milliseconds (gives us 41 years of IDs with a custom epoch)

Is this a typo? I calculated that you can store 69 years of millisecond timestamps in 41 bits. Here's how:

  • Max milliseconds stored in 41 bits: (2^41)-1 = 2199023255551 ms
  • Divided by (1000 * 60 * 60 * 24 * 365) ms/year ≈ 69 years

So, where am I wrong?

You're not wrong about the calculation.

(2^41)-1 ms
    == 2199023255.551 s
    == 610839.7932086 hr 
    == 25451.65805036 days 
    == 69.6828 Julian years 
    == 69.6843 Gregorian years

This lines up closely with your result (about 69 years).
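The same chain of conversions can be checked with a few lines of Python (shown only as a quick sanity check; 365.25 and 365.2425 days are the standard Julian and Gregorian average year lengths):

    # Largest value that fits in 41 bits, interpreted as milliseconds
    max_ms = (1 << 41) - 1        # 2199023255551 ms

    seconds = max_ms / 1000       # 2199023255.551 s
    hours = seconds / 3600        # ~610839.79 hr
    days = hours / 24             # ~25451.66 days

    print(days / 365)             # ~69.73, using the 365-day year from the question
    print(days / 365.25)          # ~69.68 Julian years
    print(days / 365.2425)        # ~69.68 Gregorian years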

However, the website you link to does say that 41 bits gives them

41 years of IDs with a custom epoch

"Epoch" in this context is probably referring to the start date. Given that that article was published "3 years ago", or in 2012 , we can calculate that their epoch begins in 2012 + 41 - 69 == 1984 . This date was possibly chosen as a reference .
