Information Theory, Step 6: Hamming Code
This article is the logical continuation of the previous article devoted to error correction. The goal is to create an error-correction algorithm with the help of parity bits (discussed in the previous article).
Hamming codes are a family of linear error-correcting codes that generalize the Hamming (7,4) code invented by Richard Hamming in 1950.
For each integer r ≥ 2 there is a code with block length n = 2^r−1 and message length k = 2^r−r−1.
(n, k)=(2^r−1, 2^r−r−1)
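The formula above can be checked with a few lines of Python (a small illustration, not part of the original article):

```python
# For each r >= 2, a Hamming code has block length n = 2^r - 1
# and message length k = 2^r - r - 1 (i.e. r parity bits per block).
for r in range(2, 6):
    n = 2 ** r - 1
    k = n - r
    print(f"r={r}: (n, k) = ({n}, {k})")
```

For r = 3 this gives (n, k) = (7, 4), which is exactly the code built in this article.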
The main goal of a Hamming code is to increase the Hamming distance between valid codewords while keeping the code rate as high as possible.
In this article we are going to develop the (7,4) Hamming code: to encode a message of 4 symbols (k) we need 3 parity bits, producing a 7-symbol (n) codeword.
Important: with this configuration the algorithm is only able to correct a single error per block. However, the error is corrected anywhere in the encoded message, so even if the error occurred in a parity bit we will be able to detect and correct it.
The formula for each parity bit is the following:
first parity bit = i1 XOR i2 XOR i3 (where i is a symbol from the original message)
second parity bit = i2 XOR i3 XOR i4
third parity bit = i1 XOR i2 XOR i4
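The three XOR formulas translate directly into code. A minimal Python sketch (the function name is mine, not from the article):

```python
def parity_bits(i1, i2, i3, i4):
    """Compute the three parity bits of a 4-bit message,
    following the XOR formulas above."""
    r1 = i1 ^ i2 ^ i3
    r2 = i2 ^ i3 ^ i4
    r3 = i1 ^ i2 ^ i4
    return r1, r2, r3

print(parity_bits(1, 0, 1, 1))  # → (0, 0, 0)
```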
We then calculate the error syndrome, which will help us locate the erroneous symbol.
S1 = first parity bit XOR i1 XOR i2 XOR i3
S2 = second parity bit XOR i2 XOR i3 XOR i4
S3 = third parity bit XOR i1 XOR i2 XOR i4
Error syndrome = (S1, S2, S3)
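The syndrome computation is again three XORs, now over the received bits. A small sketch under the same naming assumptions as before (r1..r3 are the received parity bits):

```python
def syndrome(r1, r2, r3, i1, i2, i3, i4):
    """Recompute each parity from the received message bits and XOR it
    with the received parity bit; (0, 0, 0) means no detectable error."""
    s1 = r1 ^ i1 ^ i2 ^ i3
    s2 = r2 ^ i2 ^ i3 ^ i4
    s3 = r3 ^ i1 ^ i2 ^ i4
    return s1, s2, s3

# A valid codeword for message 1011 yields the zero syndrome:
print(syndrome(0, 0, 0, 1, 0, 1, 1))  # → (0, 0, 0)
# Flipping i2 in transit flips all three checks:
print(syndrome(0, 0, 0, 1, 1, 1, 1))  # → (1, 1, 1)
```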
And here is the magic: we look up the value of the error syndrome and find which symbol of our message is erroneous. Since each symbol participates in a distinct subset of the parity checks, the syndrome identifies it uniquely:

(0, 0, 0) -> no error
(1, 0, 0) -> first parity bit
(0, 1, 0) -> second parity bit
(0, 0, 1) -> third parity bit
(1, 0, 1) -> i1
(1, 1, 1) -> i2
(1, 1, 0) -> i3
(0, 1, 1) -> i4

where i is a symbol in the initial message, and r is a parity bit.
The logic is quite straightforward: we divide our initial message (converted into a sequence of 0s and 1s using the Shannon-Fano or Huffman algorithm) into blocks of 4 symbols, add parity bits to each block, and send it. If there were transmission errors, our system will be able to detect them in the received message and correct them. However, as discussed above, only one error per block can be corrected.
Here's the implementation.
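The original code listing is not reproduced here; a minimal self-contained Python sketch of the (7,4) encoder/decoder, built from the formulas above, could look like this (the codeword layout r1, r2, r3, i1, i2, i3, i4 and all names are my assumptions):

```python
# Syndrome (s1, s2, s3) -> index of the flipped bit in the codeword
# [r1, r2, r3, i1, i2, i3, i4], per the lookup table above.
SYNDROME_TO_POS = {
    (1, 0, 0): 0,  # r1
    (0, 1, 0): 1,  # r2
    (0, 0, 1): 2,  # r3
    (1, 0, 1): 3,  # i1
    (1, 1, 1): 4,  # i2
    (1, 1, 0): 5,  # i3
    (0, 1, 1): 6,  # i4
}

def encode(block):
    """Encode a 4-bit block into a 7-bit codeword."""
    i1, i2, i3, i4 = block
    return [i1 ^ i2 ^ i3, i2 ^ i3 ^ i4, i1 ^ i2 ^ i4, i1, i2, i3, i4]

def decode(codeword):
    """Correct at most one flipped bit and return the 4 message bits."""
    r1, r2, r3, i1, i2, i3, i4 = codeword
    s = (r1 ^ i1 ^ i2 ^ i3, r2 ^ i2 ^ i3 ^ i4, r3 ^ i1 ^ i2 ^ i4)
    fixed = list(codeword)
    if s in SYNDROME_TO_POS:      # nonzero syndrome: flip the bad bit back
        fixed[SYNDROME_TO_POS[s]] ^= 1
    return fixed[3:]              # the 4 message bits

msg = [1, 0, 1, 1]
cw = encode(msg)
cw[4] ^= 1                        # simulate a single transmission error
print(decode(cw))                 # → [1, 0, 1, 1]
```

A longer message would be split into 4-bit blocks and each block encoded and decoded independently, as described above.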
And, as usual, the link to GitHub.