Short Story ⟡ Informatics

Why High-Entropy Days Are Tiring

Exploring the relationship between the entropy of daily life and fatigue, and the mechanism by which uncertainty increases cognitive load.

  • #entropy
  • #uncertainty
  • #cognitive load
  • #decision making
  • #information processing

"I'm so exhausted today."

Riku collapsed onto the club room sofa.

Yuki agreed. "Me too. Even though I didn't do anything."

Aoi asked with interest. "Didn't do anything? Tell me more."

"It was a series of choices from morning," Riku explained. "What to wear, what to eat, which route to take, who to talk to..."

"It was a high-entropy day."

"Entropy?"

Just then, Professor S entered. "Did you call?"

Aoi explained the situation. "They both seem to have decision fatigue."

Professor S said while brewing coffee, "There's a deep relationship between entropy and fatigue."

He began drawing diagrams on the whiteboard.

"Entropy H(X) = -Σ p(x) log p(x) is a measure of uncertainty."

"When there are many options, all with similar probability, entropy is maximum."

Yuki began to understand. "So today was full of unpredictable things?"

"Yes. With routines, entropy is low. The usual clothes, usual breakfast, usual route."

Riku raised his hand. "But aren't routines boring?"

"It's a tradeoff," Professor S answered. "Low entropy is predictable and efficient but information-poor. High entropy is information-rich but processing-intensive."

Aoi added. "The brain works as a predictor. When prediction error is large, it spends energy on correction."

"Prediction error?"

"The gap between expectation and reality. In high-entropy environments, prediction is difficult and errors occur constantly."

Professor S continued. "Decision-making also has energy costs. More options means more information processing."

Yuki wrote in her notebook. "What's the relationship between number of options and entropy?"

"With n equally probable options, H = log₂(n) bits."

"2 choices is 1 bit, 4 choices is 2 bits, 8 choices is 3 bits."

Riku calculated. "So today's clothing choice... with 32 items in my closet, that's 5 bits?"

"If they're all equally likely, yes," Aoi nodded. "But in reality there's bias. You choose favorite clothes with high probability."

"With bias, entropy decreases."

"Correct. That's why using heuristics is easier than completely random choice."

Professor S offered a new perspective. "What's interesting is the relationship between information value and entropy."

"What do you mean?"

"Low-entropy information is predictable with little surprise. High-entropy information is unpredictable with much surprise."

Yuki connected it. "More surprise means more tiring?"

"Cognitively, yes. But surprise is also the source of learning."

Aoi gave an example. "Consider exam studying. If it's all known problems, low entropy makes it easy but you learn little."

"If it's all new problems, high entropy makes it hard but you learn much."

Riku understood. "So moderate difficulty is important."

"Flow state," Professor S said. "When entropy is in the moderate range, people can concentrate best."

Yuki asked. "So to reduce fatigue?"

"Control entropy," Aoi answered. "To save energy for important decisions, routinize small decisions."

"Is that why Steve Jobs wore the same clothes?"

"To avoid decision fatigue," Professor S nodded. "He reduced clothing entropy to zero to focus on creative decisions."

Riku laughed. "Maybe I should switch to a uniform starting tomorrow."

"But if you completely routinize?"

"You get bored. Humans seek moderate uncertainty."

Aoi summed it up. "Too low entropy is boring; too high entropy is fatiguing. There's an optimal balance."

Professor S handed out the coffee. "Information theory provides tools to quantify this balance."

Yuki summarized. "High-entropy days are tiring because the brain processes more information."

"And that fatigue is also the cost of learning and growth."

Riku took a deep breath. "So today's fatigue wasn't wasted."

"Rather, it's evidence of accumulating new experiences."

Professor S concluded. "Tomorrow, try lowering the entropy a bit from today. Recovery is also important."

The three nodded. Alternating between high-entropy and low-entropy days: that might be the rhythm of sustainable living.

Outside the window, the sunset was fading. The end of the day was a time when entropy converged.