Short Story ⟡ Informatics

How to Optimize Ambiguous Relationships

A story about understanding the gap between expectation and reality through KL divergence.

  • #KL divergence
  • #expectation
  • #reality
  • #optimization

"Riku, you were late again."

Yuki said in an exasperated voice.

"Sorry. But it's only sometimes, right?"

Aoi interjected. "Not sometimes. Statistically twice a week."

"Harsh!" Riku protested.

"This is KL divergence," Aoi said quietly.

"What's that?" Yuki asked.

"A metric measuring divergence between expectation and reality. Formally, Kullback-Leibler divergence."

Aoi wrote an equation on the whiteboard.

"D_KL(P||Q) = Σ P(x) log(P(x)/Q(x))"

"P is the true distribution, Q is the model distribution."

Yuki thought for a moment. "In Riku's case, what's P and what's Q?"

"P: Actual lateness pattern Q: Riku's claimed 'sometimes' distribution"

Riku smiled wryly. "So there's a gap?"

"Yes. The gap between expectation and reality is KL divergence."

Yuki wrote in the notebook. "If it's zero, it's a perfect match?"

"Correct. Zero KL divergence means model and reality are the same."

"But perfect match is difficult."

Aoi nodded. "So we optimize. Minimize KL divergence."

"How?" Riku asked seriously.

"Bring the model closer to reality. Or bring reality closer to the model."

Yuki was intrigued. "There are two directions?"

"Yes. In Riku's case, 'reduce lateness' or 'lower expectations.'"

"The latter is impossible," Riku answered immediately.

"Then you must change behavior," Aoi laughed.

"Can KL divergence be used for human relationships?" Yuki asked.

"It can. Expectations of others versus actual behavior. That gap creates stress."

Aoi drew a diagram.

"Expectations too high → Large KL divergence → Disappointment Expectations too low → Small KL divergence → Surprise Appropriate expectations → Small KL divergence → Stability"

Yuki understood. "Accurately understanding others is important."

"Exactly. Gather information and update the model."

Riku pondered. "But perfect understanding is impossible, right?"

"That's right," Aoi admitted. "KL divergence can't be made zero."

"But it can be reduced."

Yuki asked. "What's the problem with large KL divergence?"

"Wrong predictions, wasted resources, misunderstandings."

"In machine learning too, KL divergence is used as a loss function."

Riku looked outside. "The KL divergence between me and Aoi-senpai seems large."

"That's certain," Aoi laughed. "But it's decreasing."

"Really?"

"Yes. We understand each other better than a year ago."

Yuki smiled. "You're optimizing your ambiguous relationship."

"Good expression. Relationship optimization is KL divergence reduction."

"Then I'll make an effort to be late less often," Riku said seriously.

"Expectation value update," Aoi acknowledged. "Similar to Bayesian inference too."

Yuki closed the notebook. "Expectation and reality don't perfectly match, but can be brought closer."

"That's the essence of optimization," Aoi answered.

Riku laughed. "Information theory helps with human relationships too."

"Information theory is the mathematics of relationships."

The three quietly left the classroom, gradually optimizing their ambiguous relationships.

The journey to reduce KL divergence continues.