Short Story ⟡ Informatics

Encoding the Ambiguous Future

An exploration of entropy, uncertainty, and how information theory helps us understand the world.

  • #source coding
  • #predictive coding
  • #entropy coding
  • #prediction
  • #adaptive coding

"Can the future be predicted?"

Yuki asked suddenly. Across the counter at Café Shannon, Professor S was quietly brewing coffee.

"Not perfectly. But probabilistically, we can get close," Professor S answered.

Aoi chimed in. "There's a technique called predictive coding. You infer the future from past data and send only the difference."

"Difference?"

"Take temperature data, for example. If today is 20 degrees, you can predict tomorrow will be around 20 degrees too. Then a single bit saying 'as predicted' may be enough."

Professor S placed a cup down. "But if the prediction is wrong?"

"You send the difference from the prediction. If it's 22 degrees, send the information '+2 degrees.'"

Yuki began to understand. "So when the prediction is accurate, the amount of information you need to send goes down."

"Exactly. This is the principle of predictive coding," Aoi explained. "It's also used in video compression. Encode only the parts that changed from the previous frame."

"So that's why video files don't get so huge."
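The difference-sending scheme Aoi describes can be sketched in a few lines of Python. This is a toy illustration; the names `delta_encode` and `delta_decode` are invented here, not taken from any particular codec:

```python
def delta_encode(samples):
    """Predict each value as the previous one; store only the differences."""
    if not samples:
        return []
    # The first sample has no predecessor, so it is stored as-is.
    deltas = [samples[0]]
    for prev, cur in zip(samples, samples[1:]):
        deltas.append(cur - prev)  # the "+2 degrees" style residual
    return deltas

def delta_decode(deltas):
    """Rebuild the original sequence by accumulating the differences."""
    values = []
    total = 0
    for d in deltas:
        total += d
        values.append(total)
    return values

# Temperatures from the story: mostly "as predicted", so residuals stay small.
temps = [20, 20, 22, 21, 21]
print(delta_encode(temps))                          # [20, 0, 2, -1, 0]
print(delta_decode(delta_encode(temps)) == temps)   # True
```

The residuals cluster near zero, which is exactly what makes them cheap to encode with a code that gives short codewords to small values.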

Professor S said quietly. "In information theory, we exploit the redundancy of the information source. If past and present are correlated, the future is predictable to some degree."

"But if it's completely random?"

"In that case, prediction is useless. Entropy is maximum, and there's no room for compression."

Aoi drew a diagram in her notebook. "So understanding the properties of the information source is important. There's a concept called a Markov model."

"Markov?"

"A probabilistic model in which the next state depends only on the current state. You predict from the present alone, not from the entire past history."

Yuki asked for an example. "Like what?"

"Weather forecasting. If today is sunny, tomorrow is likely to be sunny too. Yesterday's weather influences tomorrow only through today."

Professor S added. "Character strings are the same. In English, 'Q' is followed by 'U' with high probability. Using this, you can encode efficiently."
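Professor S's point about 'Q' and 'U' is easy to check empirically. The sketch below estimates the conditional distribution of the next character from bigram counts; the function name and the toy corpus are invented for illustration:

```python
from collections import Counter

def next_char_distribution(text, context):
    """Estimate P(next char | current char = context) from bigram counts."""
    counts = Counter(b for a, b in zip(text, text[1:]) if a == context)
    total = sum(counts.values())
    return {ch: n / total for ch, n in counts.items()}

# A toy corpus, made up for this example: after 'q', 'u' dominates.
corpus = "the queen quietly questioned the quality of the quote"
print(next_char_distribution(corpus, "q"))  # {'u': 1.0}
# When P(next char) is concentrated like this, -log2(p) is tiny,
# so a context-aware code spends almost no bits on the 'u'.
```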

"Adaptive coding," Aoi said, giving the technical term. "You change the code dynamically according to the context."

Yuki said excitedly. "So even with the same 26 letters of the alphabet, the number of bits needed changes depending on the situation!"

"Correct. Consider not just frequency but also context."
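One way to picture adaptive coding: keep running symbol counts and charge each incoming symbol an ideal code length of -log2(p), where p is estimated from everything seen so far. A minimal sketch, assuming Laplace smoothing over a 26-letter alphabet (both modeling choices are mine, not the story's):

```python
import math
from collections import Counter

def adaptive_code_lengths(text, alphabet_size=26):
    """Ideal code length -log2(p) for each symbol, where p is estimated
    from the symbols seen so far, with Laplace (add-one) smoothing."""
    counts = Counter()
    seen = 0
    lengths = []
    for ch in text:
        p = (counts[ch] + 1) / (seen + alphabet_size)
        lengths.append(-math.log2(p))
        counts[ch] += 1
        seen += 1
    return lengths

# As 'a' repeats, its estimated probability rises and its code length shrinks;
# the unexpected 'b' at the end is expensive again.
print([round(x, 2) for x in adaptive_code_lengths("aaaab")])
# → [4.7, 3.75, 3.22, 2.86, 4.91]
```

The same 26-letter alphabet, but the cost per letter keeps changing as the model adapts, which is Yuki's observation in miniature.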

Professor S refilled the coffee. "But prediction always involves error. Perfect prediction is impossible."

"Uncertainty can't be eliminated?"

"It can't. But it can be reduced. That's the role of coding theory."

Aoi continued. "There's also Shannon's concept of the entropy rate: the average information per symbol over an infinitely long sequence."

"Long-term average?"

"Yes. Even if there's variation short-term, it stabilizes long-term."
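For a stationary Markov chain like the weather example, that long-term average has a closed form: H = sum_i pi_i * sum_j -P_ij * log2(P_ij), where pi is the stationary distribution. A small sketch with made-up transition probabilities:

```python
import math

def markov_entropy_rate(P, pi):
    """Entropy rate of a stationary Markov chain:
    H = sum_i pi[i] * sum_j -P[i][j] * log2(P[i][j])."""
    h = 0.0
    for i, row in enumerate(P):
        for p in row:
            if p > 0:
                h -= pi[i] * p * math.log2(p)
    return h

# Toy weather chain, states (sunny, rainy); the probabilities are invented.
P = [[0.9, 0.1],   # sunny -> sunny 0.9, sunny -> rainy 0.1
     [0.5, 0.5]]   # rainy -> sunny 0.5, rainy -> rainy 0.5
pi = [5/6, 1/6]    # stationary distribution of this particular chain
print(round(markov_entropy_rate(P, pi), 3))  # → 0.557
```

Well under 1 bit per day: because sunny days tend to repeat, each day's weather carries less information than a fair coin flip would.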

Yuki suddenly thought of something. "Life might be like that too."

Professor S smiled. "Philosophical. Indeed, short-term events are hard to predict, but long-term patterns emerge."

"Encoding the ambiguous future," Yuki murmured.

"Good expression," Aoi acknowledged. "Probabilistically model the uncertain future and encode it efficiently. That's the aesthetics of information theory."

Professor S said quietly. "But there are things that can't be encoded. Completely random things can't be compressed."

"Does that mean they're information-rich?"

"You could say that. Predictability and information content trade off against each other."

Aoi drank her coffee. "A too-predictable future might be boring."

"Because there's no surprise," Yuki understood.

"Information theory is also a discipline that measures surprise," Professor S concluded.

Outside the café, people walked along, predicting their own futures, each applying their own code to a future that can never be perfectly predicted.

"Next time, let's talk deeply about entropy," Professor S proposed.

Yuki and Aoi nodded. The journey into the ambiguous future had only just begun.