Short Story ⟡ Informatics

Random-senpai and Determinism-kun

An exploration of entropy, uncertainty, and how information theory helps us understand the world.

  • #entropy
  • #uncertainty
  • #randomness
  • #predictability

"Hey, Aoi-senpai. Is a coin toss really random?"

In the after-school information theory club room, Yuki asked while spinning a coin on the desk.

"Good question," Aoi opened their notebook and pointed to a carefully drawn diagram. "Physically, it might be deterministic. But in information theory, we call unpredictable results 'random.'"

"So it's a matter of knowledge?"

"Exactly. Have you heard of the concept of entropy?"

Yuki shook their head. Aoi began drawing a simple diagram.

"Imagine two boxes. One contains only 100 white balls. The other has 50 white and 50 black balls. When drawing a ball from each box, which result is more difficult to predict?"

"The one with 50 of each."

"Correct. That 'unpredictability' quantified is entropy. A researcher named Shannon formalized it."

At that moment, Riku burst through the door.

"Oh no! I'm late! What are we doing in club today?"

"Talking about randomness and entropy," Aoi answered.

"Random? I'm super random! People say my actions are unpredictable."

Yuki laughed. "But Riku, you're always late, right? That's predictable, isn't it?"

"Ugh, true."

Aoi wrote an equation in the notebook. "H(X) = -Σ p(x) log₂ p(x). This is the entropy formula. Each outcome x occurring with probability p(x) contributes a surprise of -log₂ p(x), and H(X) averages that surprise, so it represents the average uncertainty."

"Sounds difficult..." Yuki's expression clouded.

"Let's think of a concrete example. Say the probability of Riku arriving on time is 0.1, and the probability of being late is 0.9."

"Too realistic!" Riku exclaimed.

"In this case, the entropy is about 0.47 bits. But if Riku were always certainly late, the probability would be 1.0, and the entropy would be 0."

"When it's certain, entropy is zero?" Yuki was surprised.

"Yes. Events that are completely predictable have no informational surprise. Zero uncertainty."

Riku pondered. "Then should I act more randomly?"

"That would be problematic in a different way," Aoi smiled. "But it's an interesting perspective. In communication systems, high-entropy signals are difficult to compress. Conversely, low-entropy signals—those with patterns—can be compressed."

Yuki's eyes lit up. "So that's why zip files can make patterned data smaller!"

"Precisely. Entropy determines the compression limit. That's Shannon's source coding theorem."

"Wait," Riku raised his hand. "So completely random data can't be compressed?"

"Exactly. Trying to compress random noise is basically impossible. It's already at maximum entropy."

Yuki gazed at the coin again. "So if this coin is truly random, the result of a toss has 1 bit of entropy..."

"Correct! If heads and tails are equally probable, exactly 1 bit."

"Information theory treats uncertainty mathematically."

Aoi nodded. "Yes. Information is what reduces uncertainty. When you receive a message, entropy decreases. That's the essence of information."

Riku's face became serious. "Then tomorrow, I'll definitely arrive on time."

"You said that yesterday too," Yuki laughed.

"...So my entropy is still high, huh?"

The trio's laughter echoed through the after-school club room.