Basics of Bit Manipulation

Bits are the fundamental units of data storage in any computer-based system. A large number of bits come together to form bytes, a large number of bytes come together to form kilobytes (KB), and so on. Bits play a very important role in solving various problems.

It is known that $$1$$ byte comprises $$8$$ bits. Any integer or character can be represented using bits in a computer; this representation is known as its binary form and contains only $$1$$s and $$0$$s.

Example:
1) $$14 = 1110_2$$, because $$1 \times 2^{3} + 1 \times 2^{2} + 1 \times 2^{1} + 0 \times 2^{0} = 14$$

2) $$20 = 10100_2$$, because $$1 \times 2^4 + 0 \times 2^3 + 1 \times 2^2 + 0 \times 2^1 + 0 \times 2^0 = 20$$

For characters, we can use the ASCII representation, which maps each character to an integer. These integers can then be represented using bits as described in the examples above.
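For instance, here is a minimal C sketch (the helper name print_binary is just an assumption for illustration) that prints the bits of an integer and of a character's ASCII code:

    #include <stdio.h>

    // Hypothetical helper: prints the lowest 8 bits of a non-negative number.
    void print_binary(int n)
    {
        for (int i = 7; i >= 0; i--)
            putchar(((n >> i) & 1) ? '1' : '0');
        putchar('\n');
    }

    int main()
    {
        print_binary(20);     // prints 00010100, i.e. 10100 with leading zeros
        print_binary('A');    // 'A' has ASCII code 65, so this prints 01000001
        return 0;
    }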

Bitwise Operators:

Bitwise operators are used to perform operations on individual bits. Just as the addition and subtraction operators act on numbers, bitwise operators act directly on the individual bits of their operands.

The commonly-used bitwise operators are as follows:

NOT ( ~ ): Bitwise NOT is a unary operator that flips the bits of the number, i.e., if the $$i^{th}$$ bit is $$0$$, it changes it to $$1$$ and vice versa. Bitwise NOT is simply the one’s complement of a number. Consider an example.
$$N$$ = $$5$$ = $$101_2$$
~$$N$$ = ~$$101_2$$ = $$010_2$$ = $$2$$
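Note that this example flips only the three bits shown. In C, ~ flips every bit of the operand's type, so on a $$32$$-bit two’s-complement int, ~$$5$$ evaluates to $$-6$$; masking with the lowest three bits recovers the $$010_2$$ above. A small sketch:

    #include <stdio.h>

    int main()
    {
        unsigned int n = 5;          // ...00000101
        printf("%u\n", ~n & 7u);     // keep only the lowest 3 bits: prints 2
        printf("%d\n", ~5);          // all 32 bits flipped: prints -6
        return 0;
    }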

AND ( & ): Bitwise AND is a binary operator that operates on two equal-length bit patterns. If both bits in the compared position of the bit patterns are $$1$$, the bit in the resulting bit pattern is $$1$$, otherwise $$0$$.
$$A$$ = $$5$$ = $$101_2$$, $$B$$ = $$3$$ = $$011_2$$
$$A \& B$$ = $$101_2$$ & $$011_2$$ = $$001_2$$ = $$1$$
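A quick check of this example in C; the second line shows the common use of & with a mask to test whether a particular bit is set:

    #include <stdio.h>

    int main()
    {
        int a = 5, b = 3;                // 101 and 011
        printf("%d\n", a & b);           // prints 1 (binary 001)
        printf("%d\n", (a & 4) != 0);    // tests the 2^2 bit of 5: prints 1
        return 0;
    }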

OR ( | ): Bitwise OR is also a binary operator that operates on two equal-length bit patterns, similar to bitwise AND. If both bits in the compared position of the bit patterns are $$0$$, the bit in the resulting bit pattern is $$0$$, otherwise $$1$$.
$$A$$ = $$5$$ = $$101_2$$, $$B$$ = $$3$$ = $$011_2$$
$$A | B$$ = $$101_2$$ | $$011_2$$ = $$111_2$$ = $$7$$
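The same check for OR; the second line shows | being used to set (turn on) a particular bit:

    #include <stdio.h>

    int main()
    {
        int a = 5, b = 3;        // 101 and 011
        printf("%d\n", a | b);   // prints 7 (binary 111)
        printf("%d\n", a | 2);   // sets the 2^1 bit of 101, giving 111 = 7
        return 0;
    }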

XOR ( ^ ): Bitwise XOR also takes two equal-length bit patterns. If the bits in the compared position of the bit patterns are the same, the bit in the resulting bit pattern is $$0$$; otherwise, it is $$1$$.
$$A$$ = $$5$$ = $$101_2$$, $$B$$ = $$3$$ = $$011_2$$
$$A$$ ^ $$B$$ = $$101_2$$ ^ $$011_2$$ = $$110_2$$ = $$6$$
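And for XOR; the second line illustrates that XOR-ing a value with itself gives $$0$$, since every compared pair of bits is equal:

    #include <stdio.h>

    int main()
    {
        int a = 5, b = 3;        // 101 and 011
        printf("%d\n", a ^ b);   // prints 6 (binary 110)
        printf("%d\n", a ^ a);   // equal bits everywhere: prints 0
        return 0;
    }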

Left Shift ( << ): The left shift operator is a binary operator that shifts the bits of the given bit pattern to the left by some number of positions and appends $$0$$s at the end. Shifting left by $$k$$ bits is equivalent to multiplying the number by $$2^k$$.

$$1 << 1$$ = $$2$$ = $$2^1$$

$$1 << 2$$ = $$4$$ = $$2^2$$

$$1 << 3$$ = $$8$$ = $$2^3$$

$$1 << 4$$ = $$16$$ = $$2^4$$

$$1 << n$$ = $$2^n$$
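A small sketch verifying that shifting left by $$k$$ multiplies by $$2^k$$:

    #include <stdio.h>

    int main()
    {
        for (int k = 1; k <= 4; k++)
            printf("1 << %d = %d\n", k, 1 << k);   // prints 2, 4, 8, 16
        printf("%d\n", 5 << 2);                    // 5 * 2^2 = 20
        return 0;
    }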

Right Shift ( >> ): The right shift operator is a binary operator that shifts the bits of the given bit pattern to the right by some number of positions and appends $$0$$s at the beginning (for non-negative numbers). Shifting right by $$k$$ bits is equivalent to dividing the number by $$2^k$$ (integer division).

$$4 >> 1$$ = $$2$$

$$6 >> 1$$ = $$3$$

$$5 >> 1$$ = $$2$$

$$16 >> 4$$ = $$1$$
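Similarly for right shift, which performs integer division by $$2^k$$ and discards the bits shifted out:

    #include <stdio.h>

    int main()
    {
        printf("%d\n", 4 >> 1);     // prints 2
        printf("%d\n", 5 >> 1);     // prints 2: the lowest bit of 101 is discarded
        printf("%d\n", 16 >> 4);    // prints 1, i.e. 16 / 2^4
        return 0;
    }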

Truth Table:

    A   B   A & B   A | B   A ^ B
    0   0     0       0       0
    0   1     0       1       1
    1   0     0       1       1
    1   1     1       1       1


Bitwise operators are good for saving space and sometimes for cleverly removing dependencies. Let’s look at a simple application of these operators: finding the number of $$1$$s in the binary representation of a number. The basic approach is to traverse the binary form of the number and count the $$1$$s; however, this approach takes $$log_2 N$$ time for a number $$N$$.

Why $$log_{2} N$$ time?

To get a number in its binary form, we have to repeatedly divide it by $$2$$ until it reaches $$0$$, which takes $$log_2 N$$ steps.
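A minimal sketch of this straightforward bit-by-bit count (the name count_one_naive is just an assumption for illustration):

    // Inspects one bit per iteration, so the loop runs about log2(N) times.
    int count_one_naive(int n)
    {
        int count = 0;
        while (n)
        {
            count += n & 1;    // add the lowest bit (0 or 1)
            n = n >> 1;        // drop the lowest bit
        }
        return count;
    }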
With bitwise operations, we can instead use an algorithm whose running time depends on the number of $$1$$s present in the binary form of the given number. This is much better, since it takes $$log_2 N$$ steps only in the worst case (when every bit is set).

    int count_one (int n)
    {
        int count = 0;
        while (n)
        {
            n = n & (n - 1);   // unset the rightmost set bit of n
            count++;
        }
        return count;
    }

Why does this algorithm work?

In the binary representations of $$x$$ and $$x-1$$, all the bits are the same except the rightmost set bit of $$x$$ and all the bits to the right of it. If the binary representation of $$x$$ is $$a1b$$, where $$a$$ is some binary sequence and $$b$$ is a sequence of $$0$$s, then $$x-1$$ is $$a0c$$, where $$c =$$ ~$$b$$ is a sequence of $$1$$s. Taking the bitwise AND of $$x$$ and $$x-1$$ gives $$a0b$$, which is $$x$$ with its last set bit unset. If we keep doing this, updating $$x$$ each time, we unset the set bits of $$x$$ one by one until $$x$$ ultimately becomes $$0$$. The number of times we do this equals the number of $$1$$s in the binary representation of $$x$$.

Example
$$n = 23 = 10111_2$$. Initial $$count = 0$$.

  1. $$n$$ will change to $$n\&(n-1)$$. As $$n-1 = 22 = 10110_2$$, $$n\&(n-1)$$ will be $$10111_2 \& 10110_2$$, which is $$10110_2 = 22$$. Therefore, $$n$$ and $$count$$ change to $$22$$ and $$1$$ respectively.

  2. As $$n-1 = 21 = 10101_2$$, $$n\&(n-1)$$ will be $$10110_2 \& 10101_2$$, which is $$10100_2 = 20$$. Therefore, $$n$$ and $$count$$ change to $$20$$ and $$2$$ respectively.

  3. As $$n-1 = 19 = 10011_2$$, $$n\&(n-1)$$ will be $$10100_2 \& 10011_2$$, which is $$10000_2 = 16$$. Therefore, $$n$$ and $$count$$ change to $$16$$ and $$3$$ respectively.

  4. As $$n-1 = 15 = 01111_2$$, $$n\&(n-1)$$ will be $$10000_2 \& 01111_2$$, which is $$00000_2 = 0$$. Therefore, $$n$$ and $$count$$ change to $$0$$ and $$4$$ respectively. Since $$n = 0$$, the loop terminates and the function returns $$4$$.
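The trace above can be double-checked with a small driver that calls count_one (assuming the function is defined as shown earlier):

    #include <stdio.h>

    int count_one(int n);                 // the function shown above

    int main()
    {
        printf("%d\n", count_one(23));    // prints 4, matching the trace
        printf("%d\n", count_one(16));    // 16 = 10000 in binary, so prints 1
        return 0;
    }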

Complexity
$$O(K)$$, where $$K$$ is the number of $$1$$s present in the binary form of the given number.

Contributed by: Anand Jaisingh