Short Story ⟡ Informatics

The Secret Hidden in Noisy Messages

Learning the fundamental unit of information and how everything can be measured in bits.

  • #error correction
  • #hamming distance
  • #redundancy
  • #parity bits

"Can you read this?"

Riku held out an old piece of paper. The ink had smudged, and several characters were illegible.

"W■ll, inf■rmati■n the■ry..." Yuki tried to read.

"'Well, information theory is,' right?" Aoi filled in.

"How do you know?"

"Context. Inferring missing parts from surrounding information. That's the power of redundancy."

At that moment, Mira quietly approached and wrote in her notebook: "Error correction through redundancy."

Aoi nodded. "Yes. In communication, noise corrupts information. But by adding redundancy, we can restore the original message."

Riku asked curiously, "Redundancy means waste, right?"

"Usually. But in communication, intentional waste becomes important."

Yuki opened the notebook. "Are there concrete examples?"

Aoi drew a simple diagram.

"Say we send a 3-bit message '101'. But the channel has noise, and 1 bit might flip."

"So it could change to '001' or '111'?"

"Yes. The receiver doesn't know if there's an error. But what if we repeat each bit three times and send '111000111'?"

Riku thought. "Even if one bit flips, majority voting recovers the original bit!"

"Correct. That's the simplest error-correcting code: the repetition code."
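Aoi's repetition code can be sketched in a few lines of Python. (A minimal illustration; the function names here are mine, not part of the story.)

```python
def encode_repetition(bits, n=3):
    """Repeat each bit n times: '101' -> '111000111'."""
    return "".join(b * n for b in bits)

def decode_repetition(received, n=3):
    """Majority-vote each group of n bits to recover the message."""
    out = []
    for i in range(0, len(received), n):
        block = received[i:i + n]
        # The majority symbol wins, even if one bit in the block flipped.
        out.append("1" if block.count("1") > n // 2 else "0")
    return "".join(out)

# '101' is sent as '111000111'; even with one flipped bit per block,
# majority voting restores the original message.
assert encode_repetition("101") == "111000111"
assert decode_repetition("110000111") == "101"
```

With n = 3, each block tolerates one flipped bit; two flips in the same block would fool the vote.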

Mira wrote another example: "Hamming(7,4) code."

"The Hamming code," Aoi continued explaining. "You add 3 parity bits to 4 data bits, making 7 bits in total. It can correct any single-bit error."

Yuki was fascinated. "What's parity?"

"Check bits. For example, with even parity, you add bits so the number of 1s becomes even."

Aoi wrote on the whiteboard.

"Data: 1011
Number of 1s: 3 (odd)
Even parity bit: append 1 to make the count even
Transmit: 10111"

"If you receive 10110, the number of 1s is odd, so you can detect that an error occurred."
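The whiteboard example can be checked in code. (A small sketch of even parity; again, the function names are illustrative.)

```python
def add_even_parity(data):
    """Append a parity bit so the total number of 1s is even."""
    parity = str(data.count("1") % 2)
    return data + parity

def check_even_parity(word):
    """True if the word has an even number of 1s (no error detected)."""
    return word.count("1") % 2 == 0

assert add_even_parity("1011") == "10111"   # 3 ones, so append a 1
assert check_even_parity("10111")           # intact: four 1s, even
assert not check_even_parity("10110")       # one flipped bit: error detected
```

Note that two flipped bits cancel out, so simple parity detects only an odd number of errors.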

Riku clapped. "But you can't tell where the error is, right?"

"Not with a single parity bit. But the Hamming code uses multiple parity checks to pinpoint where the error is."
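The Hamming(7,4) scheme Aoi describes can be sketched like this. (A minimal illustration assuming the standard layout, with parity bits at the power-of-two positions 1, 2, and 4; the function names are mine.)

```python
def hamming74_encode(d):
    """Encode 4 data bits (a list of 0/1) into a 7-bit codeword."""
    c = [0] * 8  # index 0 unused so list indices match positions 1..7
    # Data bits go to positions 3, 5, 6, 7.
    c[3], c[5], c[6], c[7] = d
    # Each parity bit covers the positions whose binary index includes it.
    c[1] = c[3] ^ c[5] ^ c[7]
    c[2] = c[3] ^ c[6] ^ c[7]
    c[4] = c[5] ^ c[6] ^ c[7]
    return c[1:]

def hamming74_decode(r):
    """Correct up to one flipped bit, then return the 4 data bits."""
    c = [0] + list(r)
    # Recompute each parity check; together they spell the error position.
    s1 = c[1] ^ c[3] ^ c[5] ^ c[7]
    s2 = c[2] ^ c[3] ^ c[6] ^ c[7]
    s4 = c[4] ^ c[5] ^ c[6] ^ c[7]
    pos = s1 + 2 * s2 + 4 * s4   # 0 means no error detected
    if pos:
        c[pos] ^= 1              # flip the bad bit back
    return [c[3], c[5], c[6], c[7]]

data = [1, 0, 1, 1]
sent = hamming74_encode(data)
sent[4] ^= 1                     # noise flips position 5
assert hamming74_decode(sent) == data
```

The three recomputed checks form the "syndrome": read as a binary number, it is exactly the position of the flipped bit.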

Mira drew a new diagram. Complex matrices and bit arrangements.

Aoi added. "There's a concept called Hamming distance: the number of positions at which two bit strings differ. A code's error correction capability is determined by its minimum Hamming distance."

Yuki calculated. "101 and 110 have Hamming distance 2?"

"Exactly. They differ in 2 positions. In general, a code with minimum distance d can correct up to floor((d-1)/2) bit errors."
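Yuki's calculation is a one-liner. (A small sketch; the function name is illustrative.)

```python
def hamming_distance(a, b):
    """Number of positions where two equal-length bit strings differ."""
    return sum(x != y for x, y in zip(a, b))

assert hamming_distance("101", "110") == 2
# A code with minimum distance d corrects up to (d - 1) // 2 errors:
# e.g. the repetition code's two codewords '000' and '111' have d = 3,
assert hamming_distance("000", "111") == 3
# so it corrects (3 - 1) // 2 = 1 error, matching the majority vote.
```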

Riku's face became serious. "So if we add lots of redundancy, we can fix any error?"

"In theory. But the transmission rate drops. It's another tradeoff."

Mira showed a note: "Shannon limit: error-free communication near channel capacity."

"Yes. Shannon proved that as long as the rate stays below the channel capacity, the error rate can be made arbitrarily small. But finding practical codes that approach the limit is difficult."
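Mira's note has a concrete formula behind it. For a binary symmetric channel that flips each bit with probability p (my choice of channel model; the story doesn't name one), the capacity is C = 1 - H(p), where H is the binary entropy.

```python
from math import log2

def binary_entropy(p):
    """H(p) = -p*log2(p) - (1-p)*log2(1-p), in bits."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * log2(p) - (1 - p) * log2(1 - p)

def bsc_capacity(p):
    """Capacity of a binary symmetric channel with bit-flip probability p."""
    return 1 - binary_entropy(p)

assert bsc_capacity(0.0) == 1.0        # noiseless: 1 bit per channel use
assert abs(bsc_capacity(0.5)) < 1e-9   # pure noise: capacity drops to 0
```

Below that capacity, good codes can drive the error rate toward zero; above it, no code can.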

Yuki suddenly thought of something. "Is natural language like an error correction code?"

Aoi was impressed. "Sharp observation. Redundancy in grammar, context, and vocabulary provides natural error correction."

"That's why we could read 'w■ll' as 'well'."

"Exactly. The human brain is a powerful error correction decoder."

Riku looked at the original old paper again. "Then maybe we can read all of this."

The four worked together, inferring the smudged characters. Context, grammar, probability. They were using the principles of information theory without even realizing it.

"Done!" Yuki exclaimed.

The paper read:

"Information theory is technology to overcome noise. Perfect communication doesn't exist, but we can approach the limit."

Mira smiled. It was rare.

"Maybe this paper itself was a demonstration of error correction," Aoi said.

"Meta," Riku laughed.

Yuki carefully put away the paper. In noise lies the value of information. Today, they learned that firsthand.