
Need help understanding code to count bits set to 1

I came across a StackOverflow answer that gives the following code to efficiently count the number of bits that are set to 1 in a 32-bit int:

int NumberOfSetBits(int i)
{
    i = i - ((i >> 1) & 0x55555555);                // count the bits in each 2-bit pair
    i = (i & 0x33333333) + ((i >> 2) & 0x33333333); // sum adjacent pair counts into 4-bit fields
    // sum adjacent nibbles into bytes, then add all four bytes via the multiply
    return (((i + (i >> 4)) & 0x0F0F0F0F) * 0x01010101) >> 24;
}

But I had a lot of trouble understanding it, and I couldn't find a link where it's explained properly. Can anyone help me understand this piece of code, or provide a link that would be more helpful?

Answering somewhat indirectly: a great reference on how bit-twiddling routines like this (and hundreds of others) work is the book "Hacker's Delight" by Henry S. Warren, Jr. I highly recommend it -- it belongs on every programmer's bookshelf. http://www.amazon.com/Hackers-Delight-Henry-S-Warren/dp/0201914654
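
To spell out the mechanics the question asks about: the first line computes, for each 2-bit pair, how many of its bits are set (for a 2-bit value n, the count is n - (n >> 1)); the second line sums adjacent pair counts into 4-bit fields; the last line sums adjacent nibbles into bytes, then adds all four byte totals into the top byte with the 0x01010101 multiply before shifting it down. Below is a minimal sketch of the same technique (my own illustration, not part of the original answer), written against uint32_t to sidestep signed-shift and signed-overflow subtleties; popcount_naive is a hypothetical reference checker added only for comparison.

    #include <stdint.h>
    #include <stdio.h>

    /* SWAR popcount: same algorithm as the quoted code, but on an
       unsigned 32-bit type so every shift and the final multiply are
       well defined. */
    static uint32_t popcount32(uint32_t i)
    {
        i = i - ((i >> 1) & 0x55555555u);                 /* 2-bit pair counts */
        i = (i & 0x33333333u) + ((i >> 2) & 0x33333333u); /* 4-bit field sums  */
        i = (i + (i >> 4)) & 0x0F0F0F0Fu;                 /* per-byte sums     */
        return (i * 0x01010101u) >> 24;                   /* add the 4 bytes   */
    }

    /* Naive reference: test one bit at a time. */
    static uint32_t popcount_naive(uint32_t i)
    {
        uint32_t n = 0;
        while (i) { n += i & 1u; i >>= 1; }
        return n;
    }

    int main(void)
    {
        uint32_t tests[] = { 0u, 1u, 7u, 0x55555555u, 0xFFFFFFFFu };
        for (size_t k = 0; k < sizeof tests / sizeof tests[0]; ++k)
            printf("%08X -> %u (naive %u)\n",
                   (unsigned)tests[k],
                   (unsigned)popcount32(tests[k]),
                   (unsigned)popcount_naive(tests[k]));
        return 0;
    }

Running it prints matching counts for both functions, e.g. FFFFFFFF -> 32 (naive 32).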
