Short Story ⟡ Informatics

Measuring the World by Amount of Surprise

The relationship between entropy and surprise: understanding daily events through information content.

  • #entropy
  • #surprise
  • #information content
  • #daily events

"I'm surprised!"

Riku rushed into the club room.

"About what?" Yuki asked.

"Got a winning ticket from the vending machine!"

Aoi looked up with interest. "What's the probability?"

"Don't know. But I'm really surprised."

"That surprise is information content," Aoi said.

"What?"

Yuki opened her notebook. "The relationship between surprise and information content?"

"Yes. Rarer events have more information content."

Riku sat down. "What does that mean?"

Aoi explained. "If I say 'the sun rose', you're not surprised, right?"

"Because it's obvious."

"Yes. Information content is nearly zero. But what if 'the sun rose from the west'?"

"I'd be very surprised!"

"That's high information content."

Yuki understood. "Unexpectedness determines information content."

Professor S. entered the club room. "Good discussion."

"Professor, please teach us about surprise and information content."

The professor nodded. "Information content is inversely proportional to the logarithm of event probability."

"In formula terms?" Aoi supplemented. "I(x) = -log P(x)"

"The smaller probability P, the larger information content I."

Riku thought. "Then completely predictable things have zero information content?"

"Yes. Events with probability 1 carry no information."

Yuki thought of an example. "Coming to school every day has little information content."

"But if you suddenly miss?"

"High information content. Everyone thinks 'why?'"

The professor continued. "Entropy is average surprise."

"Average surprise?"

"In a situation where various events can occur, the expected information content."

Aoi drew a diagram. "A fair coin flip: heads 50%, tails 50%. Average information content = 1 bit."

"But with a biased coin?"

"Heads 90%, Tails 10% has less average information content."

Riku understood. "Easier to predict means less surprise."

"Yes. Entropy is a measure of uncertainty."

Yuki asked. "Then what's life's entropy?"

The professor smiled. "Interesting question. It differs from person to person."

"How does it differ?"

"People living predictable lives have low entropy. People with surprising days have high entropy."

Riku laughed. "I might be high entropy."

"But neither is necessarily better," the professor supplemented.

"Low entropy is stable. High entropy is stimulating."

Yuki took notes. "A matter of choice."

Aoi offered another perspective. "But true surprise comes from outside the model."

"Outside the model?"

"Outside the range of expectation. Events like 'I never even thought of that'."

The professor nodded. "Those have the highest information content."

"New knowledge, new perspectives. They change worldview."

Riku became serious. "Learning information theory was also unexpected."

"A high information content experience," Aoi acknowledged.

Yuki laughed. "At first I thought 'Information Theory Club? What's that?'"

"But now?"

"My way of seeing the world changed."

The professor said as he left, "That's the essence of learning. Growing through surprise."

"High information content experiences change people."

The three thought quietly. Surprises in daily life. Small moments of the unexpected.

The measure for those is information content.

Riku said. "Then how many bits of surprise will tomorrow have?"

"Don't know. That's why it's interesting," Aoi answered.

Yuki summarized. "Measuring the world by amount of surprise:

  1. Rare events have high information content
  2. Entropy is average surprise
  3. Life's richness is the total amount of surprise
  4. Learning is a high information content experience"

Outside the window, the sun is setting. A predictable event, with nearly zero information content.

But that beauty cannot be measured.

Information theory isn't omnipotent. But it's one ruler for measuring the world.