🎲 Stochastic Processes & Markov Chains
A Stochastic Process is a collection of random variables $\{X_t\}_{t \in T}$ indexed by time $t$. This is the foundation of Time-Series Analysis, Queueing Theory, and Reinforcement Learning.
🟢 1. Markov Chains (Discrete Time)
The Markov Property
The future depends only on the present, not the past. This “memoryless” property is defined as:

$$P(X_{n+1} = j \mid X_n = i, X_{n-1} = i_{n-1}, \dots, X_0 = i_0) = P(X_{n+1} = j \mid X_n = i)$$
Transition Matrix ($P$)
A matrix where $P_{ij}$ represents the probability of moving from state $i$ to state $j$.
- Row Stochastic: Each row must sum to 1 ($\sum_j P_{ij} = 1$).
- Chapman-Kolmogorov Equation: $P^{(m+n)}_{ij} = \sum_k P^{(m)}_{ik} P^{(n)}_{kj}$, i.e., the $n$-step transition probabilities are given by the matrix power $P^n$.
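As a concrete illustration, here is a minimal pure-Python sketch that checks the Chapman-Kolmogorov relation on a toy 2-state chain (the matrix values are invented for the example) by comparing a direct two-step sum against the matrix square:

```python
# A toy 2-state transition matrix (rows sum to 1): state 0 = "sunny", 1 = "rainy".
P = [[0.9, 0.1],
     [0.5, 0.5]]

def mat_mul(A, B):
    """Multiply two square matrices given as lists of lists."""
    n = len(A)
    return [[sum(A[i][k] * B[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

# Two-step transition matrix P^2.
P2 = mat_mul(P, P)

# Chapman-Kolmogorov: the two-step probability sums over all intermediate states.
direct = sum(P[0][k] * P[k][1] for k in range(2))
print(P2[0][1], direct)  # both equal 0.14: sunny -> rainy in two steps

# Every power of a row-stochastic matrix is still row-stochastic.
assert all(abs(sum(row) - 1.0) < 1e-12 for row in P2)
```

The same identity extends to any split $m + n$ of the step count, which is exactly why $n$-step probabilities can be read off $P^n$.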
🟡 2. Equilibrium and Convergence
Stationary Distribution ($\pi$)
A probability distribution that remains unchanged by the transition matrix: $\pi P = \pi$. In linear algebra terms, $\pi$ is a left eigenvector of $P$ associated with the eigenvalue $\lambda = 1$.
Ergodicity and Limiting Distributions
A Markov chain is ergodic if it is both irreducible (every state can be reached from every other state) and aperiodic (it does not cycle through states with a fixed period).
- Fundamental Theorem: For an ergodic Markov chain, a unique stationary distribution exists, and the chain will converge to it regardless of the initial state.
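A quick way to see the Fundamental Theorem in action is power iteration: repeatedly applying the transition matrix to an arbitrary starting distribution. A minimal sketch, using a toy 2-state chain with invented values:

```python
# Toy ergodic 2-state chain; irreducible and aperiodic, so a unique pi exists.
P = [[0.9, 0.1],
     [0.5, 0.5]]

def step(dist, P):
    """One step of the chain: left-multiply the row vector dist by P."""
    n = len(P)
    return [sum(dist[i] * P[i][j] for i in range(n)) for j in range(n)]

# Start from an arbitrary initial distribution and iterate.
pi = [1.0, 0.0]
for _ in range(200):
    pi = step(pi, P)

print(pi)  # converges to the stationary distribution [5/6, 1/6]

# Check the fixed-point property: pi P = pi.
assert all(abs(a - b) < 1e-9 for a, b in zip(step(pi, P), pi))
```

Starting from `[0.0, 1.0]` (or any other distribution) converges to the same $\pi$, which is the "regardless of the initial state" part of the theorem.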
🔴 3. Advanced Applications
Hidden Markov Models (HMM)
A doubly stochastic process where the underlying state sequence is “hidden” and only observable through a set of emission probabilities.
- Filtering: Estimating the distribution over the current hidden state given the observations so far.
- Viterbi Algorithm: Finding the single most likely sequence of hidden states given the observations.
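A compact sketch of the Viterbi dynamic program, using a hypothetical Rainy/Sunny weather HMM (all probabilities invented for illustration) where daily activities are the observations:

```python
def viterbi(obs, states, start_p, trans_p, emit_p):
    """Return the most likely hidden-state path for the observation sequence."""
    # V[t][s] = probability of the best path ending in state s at time t.
    V = [{s: start_p[s] * emit_p[s][obs[0]] for s in states}]
    back = [{}]
    for t in range(1, len(obs)):
        V.append({})
        back.append({})
        for s in states:
            # Best previous state leading into s.
            prev = max(states, key=lambda p: V[t - 1][p] * trans_p[p][s])
            V[t][s] = V[t - 1][prev] * trans_p[prev][s] * emit_p[s][obs[t]]
            back[t][s] = prev
    # Backtrack from the best final state.
    last = max(states, key=lambda s: V[-1][s])
    path = [last]
    for t in range(len(obs) - 1, 0, -1):
        path.append(back[t][path[-1]])
    return list(reversed(path))

states = ("Rainy", "Sunny")
start_p = {"Rainy": 0.6, "Sunny": 0.4}
trans_p = {"Rainy": {"Rainy": 0.7, "Sunny": 0.3},
           "Sunny": {"Rainy": 0.4, "Sunny": 0.6}}
emit_p = {"Rainy": {"walk": 0.1, "shop": 0.4, "clean": 0.5},
          "Sunny": {"walk": 0.6, "shop": 0.3, "clean": 0.1}}

print(viterbi(("walk", "shop", "clean"), states, start_p, trans_p, emit_p))
# -> ['Sunny', 'Rainy', 'Rainy']
```

The key idea is that the Markov property lets us keep only the best path into each state at each step, giving $O(T \cdot |S|^2)$ time instead of enumerating all $|S|^T$ paths.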
Markov Chain Monte Carlo (MCMC)
Algorithms used to sample from complex distributions by constructing a Markov chain whose stationary distribution is the target distribution.
- Metropolis-Hastings: The most general MCMC algorithm; it proposes moves and accepts or rejects them so that the target distribution is preserved as the chain's stationary distribution.
- Gibbs Sampling: A special case in which each variable is sampled in turn from its conditional distribution given the others.
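A minimal random-walk Metropolis-Hastings sketch (the target and proposal scale are chosen arbitrarily for illustration), sampling from a standard normal whose density we pretend to know only up to a normalizing constant:

```python
import math
import random

random.seed(0)

def target(x):
    """Unnormalized density of the standard normal."""
    return math.exp(-0.5 * x * x)

def metropolis_hastings(n_samples, step_size=1.0):
    """Random-walk MH: the proposal is symmetric, so the acceptance
    ratio reduces to target(proposal) / target(current)."""
    x = 0.0
    samples = []
    for _ in range(n_samples):
        proposal = x + random.gauss(0.0, step_size)
        # Accept with probability min(1, target(proposal) / target(x)).
        if random.random() < target(proposal) / target(x):
            x = proposal
        samples.append(x)  # on rejection the current state is repeated
    return samples

samples = metropolis_hastings(20000)[2000:]  # drop burn-in
mean = sum(samples) / len(samples)
var = sum((s - mean) ** 2 for s in samples) / len(samples)
print(mean, var)  # should be close to 0 and 1
```

Note that the normalizing constant cancels in the acceptance ratio, which is exactly why MCMC works for distributions we can only evaluate up to proportionality.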
Poisson Processes
A continuous-time stochastic process used for modeling the number of events occurring in a fixed interval of time.
- Properties: Independent, stationary increments, and inter-arrival times that are exponentially distributed (hence memoryless).
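Because the inter-arrival times of a rate-$\lambda$ Poisson process are i.i.d. Exponential($\lambda$), simulating one is just summing exponential gaps. A small sketch (the rate and horizon values are arbitrary):

```python
import random

random.seed(42)

def simulate_poisson_process(rate, horizon):
    """Return event times in [0, horizon] by summing Exponential(rate) gaps."""
    times = []
    t = random.expovariate(rate)
    while t < horizon:
        times.append(t)
        t += random.expovariate(rate)
    return times

events = simulate_poisson_process(rate=2.0, horizon=1000.0)
print(len(events))  # expected count is rate * horizon = 2000
```

The event count over the horizon is Poisson-distributed with mean $\lambda T$, so for $\lambda = 2$ and $T = 1000$ we expect roughly 2000 events, give or take a few standard deviations ($\sqrt{2000} \approx 45$).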