Short Story ⟡ Informatics

Entropy Rate and the Flow of Time

How much does entropy increase with time? Understanding the essential nature of information sources.

  • #entropy rate
  • #stationary process
  • #time series
  • #information source

"Aoi-senpai, how is daily repetition from an entropy perspective?"

Riku asked abruptly.

"That's about stationary processes," Aoi answered.

"Stationary process?" Yuki asked.

"A process whose statistical properties don't change over time. For example, living with the same pattern every day."

Aoi drew a diagram in the notebook.

"But it's not completely the same. There are gradual changes."

"That's where entropy rate becomes important."

"Entropy rate?"

"Amount of entropy increase per unit time. H'(X) = lim_{n→∞} H(X_n|X_1,...,X_{n-1})/n"

Riku looked confused. "Too complex."

"Simply put, how much new information is born per second."

Yuki began to understand. "Related to predictability?"

"Exactly. Lower entropy rate means easier prediction."

Aoi gave an example.

"Weather has some patterns. Days after sunny are often sunny."

"Markov chain?" Yuki said.

"Yes. Knowing the current state, you don't need to know all the past."

"Efficient."

"Entropy rate quantifies this Markov property."

Riku asked from another angle. "So what's my life's entropy rate?"

Aoi laughed. "Might be high, because you're unpredictable."

"Is that a compliment?"

"Could go either way. High entropy rate means rich in change."

Yuki thought. "But too low would be boring, right?"

"Yes. If completely predictable, entropy rate is zero. Nothing new happens."

Aoi wrote an equation.

"H'(X) ≤ H(X). Equality only for independent case."

"With correlation, entropy rate decreases."

"Means we can learn from the past?" Yuki confirmed.

"Exactly. Knowing the past reduces future uncertainty."

Riku got excited. "So studying history is a way to lower our entropy rate?"

"Interesting interpretation," Aoi admitted. "Finding patterns increases predictability."

"But," Yuki said worriedly, "relying too much on patterns makes us unable to handle new changes."

"Overfitting," Aoi pointed out. "Same problem in machine learning."

"Overfitting?"

"Memorizing training data patterns too much, becoming unable to handle new data."

Riku understood. "Balance is important."

"Yes. Appropriate regularity and appropriate change."

Aoi looked out the window.

"Life too, entropy rate might be important."

"What do you mean?" Yuki asked.

"Too low is boring, too high is chaos. Appropriate rate creates fulfillment."

Riku nodded. "A little bit new each day."

"All new things is exhausting, but all same things is boring."

Yuki summarized in her notebook. "Entropy rate is balance of change and stability."

"Beautiful summary," Aoi said.

"Information theory," Riku said, "can also be a life guide."

"A tool is a tool, but insights can be gained."

Aoi added finally.

"Entropy rate represents the essence of information source. Humans are also a kind of information source."

"What information we transmit," Yuki pondered.

"And at what rate we generate new information."

"I'm high rate," Riku said.

"I'm medium rate," Yuki continued.

"I'm..." Aoi pondered. "Adaptive rate that changes with situation, maybe."

The three laughed.

Time keeps flowing. Entropy keeps increasing. But maybe we can choose that rate.

Today too, they generated a new bit.