Short Story ⟡ Informatics

That Odd Feeling Might Be KL Distance

After-school discussions about how probability shapes our understanding of information and uncertainty.

  • #KL divergence
  • #expectation mismatch
  • #subjective probability
  • #model fitting

"Something feels off."

Riku muttered, glaring at his textbook.

"What?" Yuki asked.

"The answer to this problem. It should be correct, but it doesn't sit right."

Aoi looked over with interest. "That 'odd feeling' might be measurable."

"Measure a feeling?"

"KL divergence. We covered it before, right?"

"Oh, KL-senpai," Riku recalled. "But this is a math problem?"

"Human discomfort arises from gaps between expectation and reality. That's what KL divergence quantifies."

Yuki opened her notebook. "So Riku predicted a distribution of answers in his mind, and it diverges from the actual answer distribution?"

"That's philosophical," Riku laughed.

"But accurate," Aoi continued. "The brain constantly creates probability models. When prediction and observation diverge, we perceive discomfort."

"So my brain's model is buggy?"

"Not buggy, but data-insufficient. Update the model and the discomfort disappears."

Yuki offered another example. "I get that sometimes. Like when a friend reacts differently than I expected."

"Same thing," Aoi explained. "You're inferring the other person's probability model. But it's not perfect, so gaps occur."

"The larger D_KL(truth||prediction), the greater the surprise or discomfort."

Riku was convinced. "So discomfort is information-theoretically a normal response?"

"Yes. It's a sensor detecting differences between prediction and observation."

Yuki pondered. "Then what about people who feel no discomfort at all?"

"Either their model is perfect, or they don't update it."

"Both extremes."

Aoi drew a diagram.

"Subjective probability P, objective probability Q. The larger the KL divergence, the greater the cognitive dissonance."

"Cognitive dissonance... difficult term," Riku said.

"Basically, the 'something's weird' feeling."

Yuki clapped her hands. "Can déjà vu be explained this way?"

"Interesting perspective. Déjà vu might be when current situation and memory probability distributions are similar. Small KL divergence."

"Conversely, completely unexpected events?"

"Huge KL divergence. The brain panics."

Riku's face became serious. "So to eliminate this discomfort, I should understand the problem better?"

"Correct. Learning is the process of reducing KL divergence."

"Same as machine learning," Yuki added. "Training narrows the distance between model and truth."

"Humans and machines do the same thing," Aoi nodded.

Riku reread his textbook. "Got it! The formula should be interpreted this way."

"Discomfort gone?"

"Mostly. Still a bit of KL distance though."

The three laughed.

"KL distance is a useful concept," Riku said. "Mathematical language for everyday feelings."

"Information theory's appeal is connecting abstract and concrete," Aoi said.

Yuki wrote in her notebook. "Discomfort = KL distance between expectation and reality."

"Next time I feel discomfort, I'll think about what diverges from expectation."

"That's the first step of critical thinking."

Outside the window, the sun was setting. Discomfort is a signal for growth. Today, once again, the three had updated their models a little.