"Is there a way to communicate without waste?"
Aoi nodded at Yuki's question. "The source coding theorem. A beautiful theorem proved by Shannon."
Mira opened her notebook and listened quietly.
"The entropy H of an information source sets the lower bound on the average code length," Aoi explained. "No matter how cleverly you compress, you can't go below H bits per symbol on average."
"So there's a theoretical limit?"
"Yes. But conversely, you can get arbitrarily close to H. That's optimal coding."
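Aoi's two claims, that H is a floor and that codes can approach it, can be checked in a short sketch. Huffman coding is never named in the conversation, but it is a standard optimal prefix code that lands within one bit of H; the letter frequencies below are invented for illustration:

```python
import heapq
from math import log2

# Hypothetical letter frequencies, made up for illustration (not real English stats).
probs = {"e": 0.4, "t": 0.2, "a": 0.2, "z": 0.1, "q": 0.1}

# Shannon entropy H = -sum(p * log2 p): the floor on average bits per symbol.
H = -sum(p * log2(p) for p in probs.values())

# Huffman coding: repeatedly merge the two least probable nodes.
# The counter breaks probability ties so the dicts are never compared.
heap = [(p, i, {s: ""}) for i, (s, p) in enumerate(probs.items())]
heapq.heapify(heap)
counter = len(heap)
while len(heap) > 1:
    p1, _, c1 = heapq.heappop(heap)
    p2, _, c2 = heapq.heappop(heap)
    merged = {s: "0" + code for s, code in c1.items()}
    merged.update({s: "1" + code for s, code in c2.items()})
    heapq.heappush(heap, (p1 + p2, counter, merged))
    counter += 1
codes = heap[0][2]

avg_len = sum(probs[s] * len(codes[s]) for s in probs)
print(f"H = {H:.3f} bits, Huffman average = {avg_len:.3f} bits")
```

The average code length falls between H and H + 1, just as the theorem promises: close to the entropy, never below it.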
Aoi drew an example on the whiteboard.
"Take English text. A fixed-length code for the 26 letters needs 5 bits each, since 2^5 = 32 is the first power of two that covers 26. But the actual entropy of English is only about 1.5 bits per character."
"Less than a third!" Yuki was surprised.
"Because frequency is skewed. 'e' and 't' are frequent, 'z' and 'q' are rare. This skew enables compression."
Mira wrote. "Uniform distribution: no compression"
"Correct," Aoi acknowledged. "Completely random data can't be compressed, because its entropy is already at the maximum."
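Aoi's point can be demonstrated with Python's `zlib`, a lossless DEFLATE compressor: data with skewed statistics shrinks dramatically, while near-maximum-entropy data does not (the sizes below depend on the zlib build, so take them as rough figures):

```python
import os
import zlib

repetitive = b"ab" * 5000        # heavily skewed statistics, low entropy
random_data = os.urandom(10000)  # near-maximum entropy per byte

# zlib implements DEFLATE, a lossless code: it can only exploit statistical skew.
print(len(zlib.compress(repetitive)))   # a few dozen bytes
print(len(zlib.compress(random_data)))  # roughly 10000, plus container overhead
```

Both outputs decompress back to the original bytes exactly; what differs is only how far the entropy allows the coder to go.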
Yuki thought. "So passwords are better if they can't be compressed?"
"Sharp. The less compressible a string is, the higher its entropy, and the harder it is to guess. That's one mark of a good password."
Aoi introduced a new concept. "Lossless and lossy compression. Lossless can be perfectly restored. Lossy discards information."
"The difference between ZIP and JPEG," Yuki understood.
"Yes. ZIP is lossless, JPEG is lossy. JPEG removes information humans don't notice."
Mira added a note. "Lossy compression: perceptual coding"
"Perceptual coding," Aoi continued. "It exploits the limits of human perception. High frequencies we can't hear, fine color differences we can't see. Removing them doesn't change what we perceive."
"But information is lost," Yuki confirmed.
"Right. That's why medical images and legal documents use lossless compression. Data that can't be lost."
Aoi drew a graph: the relationship between compression ratio and quality.
"Lossy compression achieves high compression by sacrificing quality. A tradeoff."
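The tradeoff Aoi graphs can be sketched with a toy scalar quantizer, a simplified stand-in for what JPEG-style codecs do to each coefficient: a coarser step means fewer distinct levels to store (higher compression) but a larger reconstruction error (lower quality):

```python
samples = list(range(0, 256, 7))   # hypothetical 8-bit sample values

def lossy_roundtrip(xs, step):
    """Encoder keeps only round(x / step); the remainder is discarded for good."""
    indices = [round(x / step) for x in xs]   # what actually gets stored
    return [i * step for i in indices]        # decoder's best reconstruction

errors = {}
for step in (4, 16, 64):                      # coarser step = fewer bits stored
    rec = lossy_roundtrip(samples, step)
    errors[step] = max(abs(a - b) for a, b in zip(samples, rec))
    print(f"step={step:2d}  max reconstruction error={errors[step]}")
```

The error is bounded by step/2, so the quality loss grows exactly as the step (and with it the compression) grows: the tradeoff in miniature.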
Yuki wrote in the notebook. "Lossless compression's limit is entropy. Lossy depends on what you discard."
"Perfect summary."
Mira quietly stood up and wrote on the whiteboard.
"Communication = source coding + channel coding"
"Yes," Aoi was impressed. "Optimal communication has two stages. First compress the information source. Then add error correction codes."
"Compression and redundancy, doing opposite things," Yuki noticed.
"Beautiful symmetry. Source coding removes waste, channel coding adds intentional redundancy."
"Overall?"
"Communication that approaches the entropy limit in efficiency while staying robust to noise. That's the theoretical framework Shannon laid out."
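The two-stage pipeline Aoi describes can be sketched end to end: compress with zlib, then protect with a toy 3x repetition code and majority voting (a stand-in for real channel codes such as Hamming or LDPC):

```python
import zlib

def channel_encode(data: bytes) -> bytes:
    # Toy channel code: send every byte three times. Real systems use
    # Hamming, Reed-Solomon, LDPC and the like, but the role is the same:
    # add intentional redundancy.
    return bytes(b for byte in data for b in (byte, byte, byte))

def channel_decode(data: bytes) -> bytes:
    out = bytearray()
    for i in range(0, len(data), 3):
        a, b, c = data[i:i + 3]
        out.append(a if a in (b, c) else b)   # majority vote per position
    return bytes(out)

message = b"balance of efficiency and robustness " * 20

# Stage 1: source coding removes waste. Stage 2: channel coding adds redundancy.
sent = channel_encode(zlib.compress(message))
noisy = bytearray(sent)
noisy[5] ^= 0xFF                              # the channel corrupts one byte
received = zlib.decompress(channel_decode(bytes(noisy)))
assert received == message                    # both stages survive the noise
```

Note the order matters: compressing first strips the accidental redundancy, and the channel code then adds back redundancy that is engineered to be useful against noise.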
Yuki pondered. "Does human conversation have the same structure?"
"An interesting perspective. The brain compresses what it wants to say into words. At the same time, to head off misunderstanding, it adds redundant expressions."
"Like 'in other words' or 'that is to say.'"
"Yes. That's natural error correction."
Mira smiled and wrote. "Optimal communication: balance of efficiency and robustness"
Yuki nodded. "Balance of efficiency and robustness. That's optimal communication."
The sunset illuminated the club room. An afternoon spent thinking about optimal communication, an afternoon that touched the core of information theory.