## Saturday, September 26, 2009

### Even Parity -- comparison of two algorithms

To set an ASCII character to even parity, we first count the number of bits with the value 1. If the count is even, the ASCII character already has even parity. Otherwise, we set the most significant bit of the ASCII character to 1 to make the parity even. The following code (in Java) demonstrates this:
```
public static byte setEvenParity(byte b)
{
    short countOne = 0;

    if (0 != (b & 0x01)) ++countOne;
    if (0 != (b & 0x02)) ++countOne;
    if (0 != (b & 0x04)) ++countOne;
    if (0 != (b & 0x08)) ++countOne;
    if (0 != (b & 0x10)) ++countOne;
    if (0 != (b & 0x20)) ++countOne;
    if (0 != (b & 0x40)) ++countOne;

    byte evenB = (1 == (countOne % 2)) ? (byte)(b | 0x80) : b;

    return evenB;
}
```

Can we improve on it a little?

As we can see, when we convert the same byte twice, we repeat the same work of checking the bits and setting the most significant bit. What if we remembered what we did the first time and reused the result the second time? For example, the first time we see the character 'T', whose binary form is "0101 0100", we count three bits with the value 1, so we set its most significant bit to 1, producing "1101 0100". When we see 'T' again, we should know immediately that the output is "1101 0100" without doing the checking again.
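This remember-the-first-result idea can be sketched as a small cache. The class name `ParityCache`, the cache arrays, and the method name `cachedSetEvenParity` below are my illustrative assumptions, not code from the post; the bit counting itself is just a compact loop equivalent of the first algorithm.

```java
public class ParityCache
{
    // one slot per 7-bit ASCII code; filled in lazily on first use
    private static final byte[] cache = new byte[128];
    private static final boolean[] cached = new boolean[128];

    public static byte cachedSetEvenParity(byte b)
    {
        int i = b & 0x7f;
        if (!cached[i])
        {
            cache[i] = setEvenParity(b);  // do the bit counting only once per character
            cached[i] = true;
        }
        return cache[i];
    }

    // the first algorithm, reproduced (as a loop over the seven data bits)
    // so this sketch is self-contained
    public static byte setEvenParity(byte b)
    {
        short countOne = 0;
        for (int mask = 0x01; mask <= 0x40; mask <<= 1)
            if (0 != (b & mask)) ++countOne;
        return (1 == (countOne % 2)) ? (byte)(b | 0x80) : b;
    }
}
```

After the first pass over a string, every repeated character is answered from the cache.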

Or even better: we can know which characters need the parity bit set, and which do not, before we process any string at all. After all, there are only 128 ASCII characters, so all we need is a precomputed table. When we see the character 'T', we look it up in the table and get "1101 0100".
```
private static int[] withEvenParity = {
    0x00,0x81,0x82,0x03,0x84,0x05,0x06,0x87,0x88,0x09,0x0A,0x8B,0x0C,0x8D,0x8E,0x0F,
    0x90,0x11,0x12,0x93,0x14,0x95,0x96,0x17,0x18,0x99,0x9A,0x1B,0x9C,0x1D,0x1E,0x9F,
    0xA0,0x21,0x22,0xA3,0x24,0xA5,0xA6,0x27,0x28,0xA9,0xAA,0x2B,0xAC,0x2D,0x2E,0xAF,
    0x30,0xB1,0xB2,0x33,0xB4,0x35,0x36,0xB7,0xB8,0x39,0x3A,0xBB,0x3C,0xBD,0xBE,0x3F,
    0xC0,0x41,0x42,0xC3,0x44,0xC5,0xC6,0x47,0x48,0xC9,0xCA,0x4B,0xCC,0x4D,0x4E,0xCF,
    0x50,0xD1,0xD2,0x53,0xD4,0x55,0x56,0xD7,0xD8,0x59,0x5A,0xDB,0x5C,0xDD,0xDE,0x5F,
    0x60,0xE1,0xE2,0x63,0xE4,0x65,0x66,0xE7,0xE8,0x69,0x6A,0xEB,0x6C,0xED,0xEE,0x6F,
    0xF0,0x71,0x72,0xF3,0x74,0xF5,0xF6,0x77,0x78,0xF9,0xFA,0x7B,0xFC,0x7D,0x7E,0xFF,
};

public static byte fasterSetEvenParity(byte b)
{
    int i = b & 0x7f;

    byte evenB = (byte)withEvenParity[i];

    return evenB;
}
```

Note: we can use the first algorithm to generate the withEvenParity[] array.
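As a sketch of that note, a small generator program might look like this. The class name `GenerateParityTable` is my assumption; the loop simply applies the first algorithm to each of the 128 ASCII codes and prints them 16 per row, in the same format as the withEvenParity[] array above.

```java
public class GenerateParityTable
{
    // the first algorithm, reproduced (as a loop over the seven data bits)
    // so this sketch is self-contained
    public static byte setEvenParity(byte b)
    {
        short countOne = 0;
        for (int mask = 0x01; mask <= 0x40; mask <<= 1)
            if (0 != (b & mask)) ++countOne;
        return (1 == (countOne % 2)) ? (byte)(b | 0x80) : b;
    }

    public static void main(String[] arg)
    {
        StringBuilder row = new StringBuilder();
        for (int i = 0; i < 128; ++i)
        {
            row.append(String.format("0x%02X,", setEvenParity((byte)i) & 0xff));
            if (15 == i % 16)          // 16 entries per printed row
            {
                System.out.println(row);
                row.setLength(0);
            }
        }
    }
}
```

Running it once produces the table text, which can then be pasted into the source.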

Is the second algorithm really faster? I ran a small test with the program below; the results follow.
```
public static void main(String arg[])
{
    String s = "This is a test string.";

    byte[] bytes = s.getBytes();

    for (int i = 0; i < 100000000; ++i)
    {
        for (int j = 0; j < bytes.length; ++j)
        {
            byte result =
                setEvenParity(bytes[j]);
                //fasterSetEvenParity(bytes[j]);

            if (0 == i)
                System.out.print(Integer.toHexString(bytes[j] & 0xff)
                    + ":" + Integer.toHexString(result & 0xff) + " ");
        }
    }

    System.out.println();
}
```

Using the first algorithm, the running time of the program is:
```
real    0m23.156s
user    0m22.863s
sys     0m0.015s
```

Using the second algorithm, the running time of the program is:
```
real    0m6.838s
user    0m6.692s
sys     0m0.026s
```

Yeah! The faster one is really faster.

Read my other post, Print byte in hex, which uses a similar idea to speed things up.