Probability and Statistics 2
The city’s sage, Elara, had studied Probability and Statistics 2.

The Random Walk to Nowhere

Elara began by modeling a single fishing boat’s position over time. In Stat 1, you’d say: the boat’s position after t hours is normally distributed with mean 0 and variance tσ². But Elara knew better. The Drift meant each step’s variance was itself random.

She introduced the law of total variance: Var(Y) = E[Var(Y|X)] + Var(E[Y|X]). The fishermen scratched their heads. She explained: “The total uncertainty of your position comes from two things: the average internal chaos (the Drift’s random variance) plus the uncertainty in the Drift’s mean behavior.”

She invoked Bayes’ rule: Posterior ∝ Likelihood × Prior. Using Markov chain Monte Carlo (MCMC), a computational method for sampling from complex posterior distributions, she showed that neither guild was entirely wrong. The Drift had a hidden Markov structure: it switched between “tide-like” and “random walk” states at random intervals. The probability of switching was itself a parameter.
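The updating rule Posterior ∝ Likelihood × Prior, explored by MCMC, can be sketched for the one parameter the story names: the Drift’s switching probability. This is a minimal random-walk Metropolis sampler under stated assumptions; the switch record, the uniform prior, and the step size are all illustrative inventions, not Elara’s actual model.

```python
import math
import random

# Made-up record of whether the Drift switched state in each interval
# (1 = switched, 0 = stayed). Purely illustrative data.
switches = [1, 0, 0, 1, 0, 1, 1, 0, 0, 0, 1, 0, 0, 1, 0]

def log_posterior(p):
    """Log of (Bernoulli likelihood x uniform prior), up to a constant."""
    if not 0.0 < p < 1.0:
        return float("-inf")  # zero prior mass outside (0, 1)
    k, n = sum(switches), len(switches)
    return k * math.log(p) + (n - k) * math.log(1.0 - p)

def metropolis(n_samples=20000, step=0.1, seed=0):
    """Random-walk Metropolis: propose a nudge, accept by posterior ratio."""
    rng = random.Random(seed)
    p = 0.5  # start in the middle of the prior
    samples = []
    for _ in range(n_samples):
        proposal = p + rng.gauss(0.0, step)
        # Accept with probability min(1, posterior(proposal) / posterior(p)).
        if rng.random() < math.exp(min(0.0, log_posterior(proposal) - log_posterior(p))):
            p = proposal
        samples.append(p)
    return samples

samples = metropolis()
posterior_mean = sum(samples) / len(samples)
```

With a uniform prior this posterior is actually known in closed form (a Beta distribution), which is exactly why it makes a good sanity check for the sampler; MCMC earns its keep on posteriors, like the full hidden-Markov model, that have no such closed form.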
The Kalman filter, now robustified, predicted the Drift would reverse direction in 20 minutes. The fleet turned back. The mountain guild, still using their old periodic model, sailed into the surge. They survived, but their nets were shredded. That night, Elara addressed the city.
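The story does not say how the filter was robustified, so here is only the plain textbook scalar Kalman filter it builds on, tracking one boat’s position through its predict/update cycle. The noise variances and measurements are made-up numbers for illustration.

```python
def kalman_step(mean, var, measurement, process_var=0.5, meas_var=2.0):
    """One predict/update cycle of a scalar (local-level) Kalman filter."""
    # Predict: the state drifts, so our uncertainty grows.
    pred_mean, pred_var = mean, var + process_var
    # Update: blend prediction and measurement by their relative certainty.
    gain = pred_var / (pred_var + meas_var)  # Kalman gain, in [0, 1]
    new_mean = pred_mean + gain * (measurement - pred_mean)
    new_var = (1.0 - gain) * pred_var
    return new_mean, new_var

mean, var = 0.0, 1.0                  # prior belief about the boat's position
for z in [0.4, 0.9, 1.3, 1.1, 1.6]:  # noisy position readings (made up)
    mean, var = kalman_step(mean, var, z)
```

Note that the filter carries a full belief, a mean and a variance, not just a point estimate: the posterior variance settles at a fixed point where growth from the predict step balances shrinkage from each measurement.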
A debate ensued. Elara stepped in: “In Stat 1, you compare point estimates. In Stat 2, you compare entire distributions of belief.”
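Elara’s distinction can be made concrete with a toy sketch: two hypothetical posteriors that share the same point estimate (mean 0.5) yet encode very different beliefs. A point-estimate comparison calls them identical; comparing the distributions, here via credible intervals from samples, does not. The Beta shapes and sample counts are illustrative assumptions.

```python
import random
import statistics

rng = random.Random(42)
# Two hypothetical posteriors with the SAME mean but different spreads.
belief_a = [rng.betavariate(2, 2) for _ in range(10000)]    # wide: little data
belief_b = [rng.betavariate(50, 50) for _ in range(10000)]  # narrow: lots of data

def credible_interval(samples, mass=0.9):
    """Central credible interval covering `mass` of the sampled belief."""
    s = sorted(samples)
    lo = int(len(s) * (1 - mass) / 2)
    return s[lo], s[-lo - 1]

mean_a = statistics.mean(belief_a)  # both means are about 0.5 ...
mean_b = statistics.mean(belief_b)
ci_a = credible_interval(belief_a)  # ... but this interval is far wider
ci_b = credible_interval(belief_b)
```

The wide belief admits almost any switching behavior; the narrow one rules most of it out, which is exactly the kind of difference a decision (sail or turn back) should hinge on.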