Short Story ⟡ Informatics

Your Words are High Entropy

An exploration of entropy, uncertainty, and how information theory helps us understand the world.

  • #entropy magnitude
  • #predictability
  • #information density
  • #language complexity

"Riku, I didn't understand your explanation at all."

Yuki was confused.

"What? I explained it super carefully!" Riku protested.

Aoi intervened. "Riku's explanations have too much entropy."

"Too much entropy?"

"Unpredictable. You can't tell what's coming next."

Mira quietly opened her notebook. Riku's explanation was transcribed there.

"So, encoding is, you know, the thing that makes data smaller. But it's different from encryption, or maybe similar. Anyway, compression? Or maybe efficiency?"

"As you can see, the sentence structure is undefined. Topics jump around. High uncertainty," Aoi analyzed.

"But in my head, it's all connected."

"That's the problem. There's a gap between the sender's internal state and the receiver's interpretation."

Yuki took notes. "Don't high-entropy words carry more information?"

"Good question," Aoi nodded. "Theoretically, high-entropy messages can carry more information. But that assumes the receiver can interpret correctly."

Mira wrote an equation. "H(X|Y) = H(X) - I(X;Y)"

"Conditional entropy: the uncertainty that remains about X after observing Y. On average, knowing Y never increases uncertainty, but a particular confusing message can; for a specific y, H(X|Y=y) may exceed H(X)."
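An aside from the story: Mira's identity can be checked numerically. Below is a minimal Python sketch using a made-up two-by-two joint distribution (the numbers are illustrative assumptions, not from the story). It computes H(X) and H(X|Y) from the definitions, computes I(X;Y) independently, and confirms H(X|Y) = H(X) - I(X;Y).

```python
from math import log2

# Toy joint distribution p(x, y) over two symbols each.
# All numbers are illustrative assumptions, not from the story.
p_xy = {
    ("a", "0"): 0.4, ("a", "1"): 0.1,
    ("b", "0"): 0.1, ("b", "1"): 0.4,
}

def H(dist):
    """Shannon entropy in bits of a distribution given as {outcome: prob}."""
    return -sum(p * log2(p) for p in dist.values() if p > 0)

# Marginal distributions p(x) and p(y)
p_x, p_y = {}, {}
for (x, y), p in p_xy.items():
    p_x[x] = p_x.get(x, 0.0) + p
    p_y[y] = p_y.get(y, 0.0) + p

H_x = H(p_x)                      # H(X)
H_x_given_y = H(p_xy) - H(p_y)    # chain rule: H(X|Y) = H(X,Y) - H(Y)

# Mutual information computed independently from its own definition
I_xy = sum(p * log2(p / (p_x[x] * p_y[y])) for (x, y), p in p_xy.items())

# Mira's identity: H(X|Y) = H(X) - I(X;Y)
print(round(H_x_given_y, 4), round(H_x - I_xy, 4))  # both ≈ 0.7219
```

The two printed numbers agree, illustrating that the identity is just a rearrangement of the definition of mutual information.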

Riku looked unconvinced. "Then should I speak with low entropy?"

"Extremely low entropy is also a problem," Aoi gave another example. "'Yes, yes, I see, I understand.' Completely predictable, so it carries almost no information."

"Then what should I do?"

"Moderate entropy. Lean on shared context while adding new information. Strike a balance between the predictable parts and the surprising parts."
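A quick numeric aside: the "entropy" of a message can be estimated from its character frequencies. Here is a minimal Python sketch (the sample strings are my own illustrative choices, not from the story) showing that repetitive text scores low, ordinary prose lands in the middle, and near-random text scores high:

```python
from collections import Counter
from math import log2

def char_entropy(text):
    """Empirical per-character Shannon entropy of a string, in bits."""
    counts = Counter(text)
    n = len(text)
    return -sum((c / n) * log2(c / n) for c in counts.values())

low = "yes yes yes yes yes"    # Aoi's "too predictable" extreme
mid = "encoding shrinks data"  # ordinary prose
high = "q7#Lp2@xZ9!mK4&vB8^"   # near-random noise: every character distinct

for label, s in [("low", low), ("mid", mid), ("high", high)]:
    print(label, round(char_entropy(s), 2))
```

Running this prints roughly 1.99, 3.69, and 4.25 bits per character; the "moderate" prose sits between the two extremes Aoi describes.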

Yuki looked back at the notebook. "Senpai, your explanations are always easy to understand."

"Maybe because the entropy is optimized."

Mira showed another paper. Aoi's explanation was transcribed there.

"Encoding is a technique to represent data efficiently. Unlike encryption, the purpose is efficiency, not secrecy."

"Clear structure. Consistent topic. Appropriate redundancy," Aoi self-analyzed.

Riku laughed. "Senpai, you analyze your own explanations?"

"Metacognition is important."

Yuki pondered. "Then does everyone have a different optimal entropy level?"

"Yes. It depends on the receiver's knowledge, context, and expectations. The same message can have different entropy for different people."

Mira wrote an additional note. "Compression requires understanding"

"Compression requires understanding," Aoi elaborated. "The more knowledge sender and receiver share, the more efficiently they can communicate."

Riku said seriously. "So I didn't have enough shared knowledge with Yuki."

"Probably. But that's something you cultivate over time."

Yuki smiled. "I feel like we increased our shared knowledge a bit in today's conversation."

"Mutual information increased," Aoi nodded.

Riku stood up. "Okay, I'll try explaining again. This time with low entropy."

"Wait," Aoi stopped him. "I said too low is also bad."

"Then medium entropy?"

"Moderately."

Mira smiled and left a note. "Optimal entropy = understanding + surprise"

Optimal entropy is the sum of understanding and surprise.

Yuki read it and nodded deeply. Communication might be a kind of entropy adjustment: balancing predictability and novelty to suit the other person. That's probably what it means to get through to someone.