"Shorter is better?"
The discussion began with Riku's simple question.
"Not always," Aoi answered.
"Why not?" Yuki tilted their head.
"Let's talk about code length." Aoi turned to the whiteboard.
"Code length is the number of bits needed to express a message."
"Shorter means more efficient?" Riku asked.
"On average, yes. But you cannot make every code short."
Aoi wrote an example.
"There are 4 symbols: A, B, C, D. All can be expressed with 2 bits: A: 00, B: 01, C: 10, D: 11."
"But what if A appears 90% of the time?"
Yuki thought. "We want to make only A short."
"Correct. For example: A: 0, B: 10, C: 110, D: 111."
"A is 1 bit, others are 2-3 bits."
Riku calculated. "Average code length is 0.9×1 + 0.0333×2 + 0.0333×3 + 0.0333×3 ≈ 1.17 bits."
"Shorter than the fixed 2 bits!"
"Yes. That's the power of variable-length codes," Aoi explained.
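Riku's arithmetic can be checked in a few lines of Python. The even 90/10 split among B, C, and D is the assumption from the dialogue:

```python
# Average code length for the skewed distribution from the dialogue:
# A appears 90% of the time; B, C, D share the remaining 10% evenly.
probs = {"A": 0.9, "B": 0.1 / 3, "C": 0.1 / 3, "D": 0.1 / 3}
lengths = {"A": 1, "B": 2, "C": 3, "D": 3}  # the variable-length code above

avg = sum(probs[s] * lengths[s] for s in probs)
print(f"average code length: {avg:.3f} bits")  # ≈ 1.167, vs. 2 bits fixed
```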
"But caution is needed. Choose the lengths badly and efficiency gets worse than fixed-length coding."
Yuki asked. "How do you decide optimal length?"
"Based on probability distribution. Ideally, code length should approach -log₂(p)."
"p is probability?"
"Yes. If probability is 1/2, then 1 bit; 1/4, then 2 bits; 1/8, then 3 bits."
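The lengths Aoi lists fall straight out of −log₂(p), as a quick sketch shows:

```python
import math

# Ideal code length is -log2(p): halving the probability adds one bit.
for p in (1 / 2, 1 / 4, 1 / 8):
    bits = -math.log2(p)
    print(f"p = {p}: ideal length {bits:.0f} bits")  # 1, 2, 3 bits
```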
Riku asked curiously. "Why logarithm?"
"Because the logarithm is the inverse of exponentiation. In a world of powers of 2, the logarithm is natural."
Aoi drew a diagram.
"Probability → Information amount (logarithm) → Code length (integer)"
"The actual code length is the information amount rounded up to the nearest integer."
Yuki wrote in the notebook. "That's why code length becomes slightly longer than entropy."
"Precisely. Shannon's source coding theorem guarantees that."
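A rough check of that guarantee on the earlier distribution: the entropy sits below the achieved average length, which in turn stays within one bit of the entropy (Shannon's bound for symbol-by-symbol codes):

```python
import math

# Entropy of the skewed distribution vs. the achieved average length.
probs = [0.9, 0.1 / 3, 0.1 / 3, 0.1 / 3]
entropy = -sum(p * math.log2(p) for p in probs)

# Average length of the code A:0, B:10, C:110, D:111.
avg_len = 0.9 * 1 + (0.1 / 3) * 2 + (0.1 / 3) * 3 + (0.1 / 3) * 3

print(f"entropy ≈ {entropy:.3f} bits, average length ≈ {avg_len:.3f} bits")
# Shannon's theorem: entropy <= average length < entropy + 1
```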
Riku had another question. "So why is it a battle?"
Aoi smiled. "Because code length is a competition for resources."
"Resources?"
"Bit space. Short codes are scarce: there are only two 1-bit codes and four 2-bit codes."
"Who gets the short codes? That's the battle."
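That budget can be made precise with the Kraft inequality (not named in the dialogue, but it is the standard formalization): a prefix code with lengths l₁…lₙ exists exactly when Σ 2^(−lᵢ) ≤ 1. The example code {0, 10, 110, 111} spends the budget exactly:

```python
# Kraft inequality: a prefix code with lengths l_i exists
# iff sum(2**-l_i) <= 1. Giving A the only available 1-bit
# code uses up half the budget at once.
lengths = [1, 2, 3, 3]  # lengths of the code 0, 10, 110, 111
budget = sum(2 ** -l for l in lengths)
print(f"Kraft sum: {budget}")  # 1.0 — the budget is fully spent
```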
Yuki understood. "Frequent symbols battle to win short codes."
"Exactly. Huffman coding is an algorithm that finds the optimal solution to that battle."
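A minimal sketch of Huffman's algorithm on the same four-symbol distribution: repeatedly merge the two least probable groups, prepending a bit to each code in the merged group.

```python
import heapq
from itertools import count

def huffman_codes(probs):
    """Build an optimal prefix code for a {symbol: probability} dict."""
    tiebreak = count()  # keeps heap entries comparable when probabilities tie
    heap = [(p, next(tiebreak), {s: ""}) for s, p in probs.items()]
    heapq.heapify(heap)
    while len(heap) > 1:
        p1, _, group1 = heapq.heappop(heap)  # two least probable groups
        p2, _, group2 = heapq.heappop(heap)
        merged = {s: "0" + c for s, c in group1.items()}
        merged.update({s: "1" + c for s, c in group2.items()})
        heapq.heappush(heap, (p1 + p2, next(tiebreak), merged))
    return heap[0][2]

codes = huffman_codes({"A": 0.9, "B": 0.1 / 3, "C": 0.1 / 3, "D": 0.1 / 3})
print(codes)  # A wins the lone 1-bit code; B, C, D take the longer ones
```

As expected, A wins the battle for the single 1-bit code, and the rare symbols endure lengths 2 and 3.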
Riku drew a picture in his notebook. Symbols competing for short codes.
"Symbol A: 'I'm used most often, give me 1 bit!' Symbol D: 'I'm rare, so I'll endure being long...'"
Yuki and Aoi laughed.
"Riku's personifications are sometimes accurate," Aoi admitted.
Yuki asked seriously. "In the code length battle, who wins?"
"Everyone wins," Aoi answered. "In optimal codes, overall average code length is minimized. Individually unfair, but collectively fair."
"That's utilitarian," Riku said.
"Information theory is often utilitarian. It emphasizes overall efficiency."
Yuki closed the notebook. "The battle over code length is deep."
"Yes. A small battle, but supporting the foundation of communication."
The three quietly nodded. The battle over code length continues today.