"How are stories born?"
Yuki asked suddenly.
"How philosophical," Aoi said, intrigued.
"No, I was thinking information-theoretically. Can stories be treated as stochastic processes?"
Professor S passed by and stopped. "Interesting perspective."
"Professor!"
"Stories can be generated probabilistically. Using Markov chains."
Riku tilted his head. "Markov?"
"A probability model where the next state depends only on the current state," Aoi explained.
The professor drew on the whiteboard. "For example, after 'sunny,' 70% 'sunny' next, 30% 'rain.'"
"That's a state transition."
"Stories are the same. From the state 'protagonist is at school,' the next state might be 'go to the classroom' or 'go to the rooftop.' Which one is decided by probability."
Yuki got excited. "So if we set probabilities, stories can be auto-generated?"
"Theoretically possible. Such algorithms actually exist."
"Let's try it!" Riku said eagerly.
The professor smiled. "Then let's start with a simple example: deciding a character's actions by probability."
Aoi wrote in the notebook. "States: club room, library, cafe"
"Now set the transition probabilities. From the club room: to the library 50%, to the cafe 30%, stay in the club room 20%."
"I see," Yuki said, taking out a die.
"I'll roll. Starting in the club room."
She rolled. "4. Move to the library."
"Next, from the library: to the club room 40%, to the cafe 40%, stay in the library 20%."
"2. Returned to club room."
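Yuki's dice procedure maps directly onto code. Below is a minimal Python sketch of the same chain; note that the outgoing probabilities from the cafe are never stated in the conversation, so the row marked `# assumed` is an invented placeholder.

```python
import random

# Transition probabilities from the dialogue; the "cafe" row is an
# invented assumption (it is never specified in the conversation).
TRANSITIONS = {
    "club room": {"club room": 0.2, "library": 0.5, "cafe": 0.3},
    "library":   {"club room": 0.4, "library": 0.2, "cafe": 0.4},
    "cafe":      {"club room": 0.5, "library": 0.3, "cafe": 0.2},  # assumed
}

def step(state, rng):
    """Sample the next state, like rolling Yuki's die."""
    states = list(TRANSITIONS[state])
    weights = [TRANSITIONS[state][s] for s in states]
    return rng.choices(states, weights=weights)[0]

def generate_story(start="club room", steps=5, seed=None):
    """Walk the chain for `steps` transitions, collecting the path."""
    rng = random.Random(seed)
    story = [start]
    for _ in range(steps):
        story.append(step(story[-1], rng))
    return story

print(" -> ".join(generate_story(seed=4)))
```

Each run with a different seed yields a different "story," which is exactly Riku's complaint below: the walk is valid but aimless.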
Riku laughed. "This is a story? Isn't it monotonous?"
"Yes. A simple Markov chain has no memory; it ignores all past history and looks only at the current state," the professor pointed out.
"Then what should we do?"
"Use a higher-order Markov model, which conditions not just on the immediately preceding state but on several past steps."
Aoi added, "Or a hidden Markov model, which has invisible internal states."
"Internal states?" Yuki asked.
"For example, a character's mood. Internal states like 'happy' or 'sad' influence actions."
The professor drew a diagram. "Internal states transition, generating observable actions."
"Deep," Riku said, impressed.
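The professor's diagram can be sketched as a tiny hidden Markov model: a hidden "mood" state transitions step by step, and each step emits a visible action. All the probabilities and action names below are made-up illustrations, not taken from the dialogue.

```python
import random

# Hidden states transition among themselves...
MOOD_TRANSITIONS = {
    "happy": {"happy": 0.8, "sad": 0.2},
    "sad":   {"happy": 0.4, "sad": 0.6},
}
# ...and each hidden state emits an observable action: P(action | mood).
EMISSIONS = {
    "happy": {"chat in club room": 0.6, "read in library": 0.3, "sit alone in cafe": 0.1},
    "sad":   {"chat in club room": 0.1, "read in library": 0.4, "sit alone in cafe": 0.5},
}

def sample(dist, rng):
    """Draw one outcome from a {value: probability} dict."""
    return rng.choices(list(dist), weights=list(dist.values()))[0]

def generate(steps=5, mood="happy", seed=None):
    """Emit a visible action at each step while the hidden mood drifts."""
    rng = random.Random(seed)
    actions = []
    for _ in range(steps):
        actions.append(sample(EMISSIONS[mood], rng))  # observable action
        mood = sample(MOOD_TRANSITIONS[mood], rng)    # hidden transition
    return actions
```

An observer sees only the action sequence; the mood that shaped it stays hidden, which is the point of the model.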
"But," Yuki wondered, "are stories generated purely by probability actually interesting?"
"Sharp question," the professor acknowledged. "Actually, completely random stories are boring."
"Why?"
"Because a story needs a balance between surprise and plausibility."
Aoi explained. "Too predictable is dull. Completely random is nonsensical."
"The right amount of predictability is what makes a story interesting."
The professor made an important point. "Good stories have an appropriate entropy rate."
"Entropy rate?"
"The average information per step. Too high and the story is chaos; too low and it's monotonous."
Yuki understood. "So writers intuitively balance it."
"Yes. They're intuitively optimizing the story as a probability model."
Riku gave an example. "Mysteries have high surprise, but looking back, everything connects."
"Perfect example," the professor praised. "Foreshadowing changes conditional probabilities."
"Foreshadowing?"
"Once the reader has a certain piece of information, the probabilities of later developments change. That's the function of foreshadowing."
Aoi summarized. "P(ending|with foreshadowing) ≠ P(ending|without foreshadowing)"
"Foreshadowing changes the probability distribution."
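Aoi's inequality can be checked on a toy example. The joint probabilities below are entirely hypothetical numbers chosen for illustration:

```python
# Toy model of foreshadowing as conditional probability:
# P(ending | with foreshadowing) != P(ending | without foreshadowing).
# Joint probabilities P(foreshadowing, ending); numbers are hypothetical.
joint = {
    ("clue shown", "twist ending"): 0.35,
    ("clue shown", "plain ending"): 0.15,
    ("no clue",    "twist ending"): 0.10,
    ("no clue",    "plain ending"): 0.40,
}

def conditional(ending, given):
    """P(ending | given) = P(given, ending) / P(given)."""
    p_given = sum(p for (f, _), p in joint.items() if f == given)
    return joint[(given, ending)] / p_given

p_with = conditional("twist ending", "clue shown")  # 0.35 / 0.50 = 0.70
p_without = conditional("twist ending", "no clue")  # 0.10 / 0.50 = 0.20
print(p_with, p_without)
```

Showing the clue shifts the twist ending from a 20% long shot to the 70% favorite: the foreshadowing has changed the distribution over endings.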
Yuki was moved. "Stories can be explained by probability theory too."
"Not everything," the professor smiled. "But part of the structure can be captured by probability models."
"So our daily lives are also probabilistically possible stories?" Riku asked.
"Exactly. Life is a continuous stochastic process."
"But it can't be completely predicted."
"Right. That's also what makes it interesting."
The four nodded. Probabilistically possible stories. It was a new way of seeing stories, illuminated by information theory.