In Shannon’s world, entropy is the average amount of information produced by a source. Equivalently, it is the minimum number of bits required, on average, to encode the source’s output without losing any information.
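This minimum-bits interpretation can be sketched numerically. The snippet below (a minimal illustration; the function name `entropy` is my own) computes Shannon entropy in bits for a discrete distribution:

```python
import math

def entropy(probs):
    """Shannon entropy in bits: H = -sum(p * log2(p)) over the distribution."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A fair coin produces 1 bit of information per flip, so it needs
# 1 bit per symbol on average; no code can do better.
print(entropy([0.5, 0.5]))   # 1.0

# A biased coin is more predictable, so it carries less information
# per flip (about 0.47 bits), and can be compressed below 1 bit/symbol.
print(entropy([0.9, 0.1]))
```

Note that a uniform source over 2^n outcomes has entropy exactly n bits, matching the intuition that n fair bits are incompressible.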
If you receive a 7-bit string, you run the three parity checks. The result (called the syndrome) is a 3-bit binary number from 000 to 111. A syndrome of 000 means no single-bit error was detected; any other value tells you exactly which bit position to flip to fix the message.
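The syndrome trick can be sketched in a few lines. This assumes the standard Hamming(7,4) layout, with parity bits at positions 1, 2, and 4 (powers of two); with that layout, the syndrome is simply the XOR of the positions that hold a 1. The function names are my own:

```python
def syndrome(bits):
    """bits: list of 7 ints (0/1); bits[0] is position 1.
    Returns 0 if the word passes all parity checks, otherwise the
    1-based position of the single flipped bit."""
    s = 0
    for pos in range(1, 8):
        if bits[pos - 1]:
            s ^= pos          # XOR together the positions holding a 1
    return s

def correct(bits):
    """Flip the bit the syndrome points at (if any) and return the fixed word."""
    s = syndrome(bits)
    fixed = bits[:]
    if s:
        fixed[s - 1] ^= 1
    return fixed

# 0110011 encodes the data bits 1011 in the standard layout.
word = [0, 1, 1, 0, 0, 1, 1]
print(syndrome(word))         # 0: a valid codeword

word[4] ^= 1                  # corrupt position 5
print(syndrome(word))         # 5: the syndrome names the bad bit
print(correct(word))          # the original codeword, restored
```

The reason this works is that placing parity bits at power-of-two positions makes each parity check cover exactly the positions whose binary representation has a 1 in that check's bit, so the checks spell out the error position directly.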