Applied Stochastic Processes

Term: Winter 2026

Updated: January 23, 2026

From the [course syllabus]:

Stochastic processes are indexed collections of random variables used to describe phenomena in which a dependence structure arises from evolution across time (or space). Markov processes, in particular, are stochastic processes in which dependence is local: given the current state (or values on a separating boundary), the future (outside the boundary) is conditionally independent of the past (or interior history). Markov processes have rich applications in epidemiology, finance, biology, social science, engineering, chemistry, and beyond, and they are also important in statistics. In particular, Markov chain Monte Carlo (MCMC) methods are central to modern Bayesian statistics as a means to approximate complex posterior distributions for Bayesian inference via simulation. This course is a graduate-level introduction to Markov processes and Markov chains, covering four key areas: discrete-time models, continuous-time models, MCMC, and, briefly, Brownian motion and Gaussian processes. Students can expect to learn core concepts and probabilistic language for describing Markov processes, gain exposure to common models and estimation methods, and explore applications.

Instructor: Trevor Ruiz (he/him) [email]

Class meetings: 10:10am–12:00pm MW in 10-124

Office hours: MW 1:00pm–2:30pm and [by appointment] in 25-236 or via Zoom; drop-ins are welcome, but appointments are appreciated.

Week 1 (1/5)

Monday: introduction to Markov chains [notes]

  • [reading] syllabus; 1.2, 2.1

  • [hw1] exercises 2.1, 2.9, 2.10, 2.16, 2.27 (copy of gamblersruin.R), due Monday 1/12
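The gambler's ruin chain behind gamblersruin.R can be simulated in a few lines. Here is a minimal sketch (in Python rather than the course's R, purely for illustration; the starting fortune, goal, and win probability are made up) that checks the classical result that, for a fair game, the probability of reaching \(N\) from \(k\) is \(k/N\):

```python
import random

def gamblers_ruin(start, goal, p, rng):
    """Run one gambler's ruin chain; return True if the goal is reached before 0."""
    x = start
    while 0 < x < goal:
        x += 1 if rng.random() < p else -1
    return x == goal

rng = random.Random(1)
# For a fair game (p = 1/2), P(reach N from k) = k/N; estimate it by Monte Carlo.
trials = 20000
wins = sum(gamblers_ruin(3, 10, 0.5, rng) for _ in range(trials)) / trials
print(round(wins, 2))  # close to 3/10
```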

Wednesday: transition probabilities [notes] [codes]

  • [reading] 2.2, 2.3
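A standard computation from this material: by the Chapman–Kolmogorov equations, \(n\)-step transition probabilities are entries of the matrix power \(P^n\). A minimal sketch with a made-up two-state transition matrix:

```python
import numpy as np

# Hypothetical two-state chain; row i gives the distribution of the next state.
P = np.array([[0.9, 0.1],
              [0.5, 0.5]])

# n-step transition probabilities: the (i, j) entry of P^n is P(X_n = j | X_0 = i).
P5 = np.linalg.matrix_power(P, 5)
print(P5.round(3))

# Each row of P^n is still a probability distribution.
print(P5.sum(axis=1))
```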

Week 2 (1/12)

Monday: limiting and stationary distributions [notes]

  • [reading] 2.4, 3.1, 3.2

  • [hw2] exercises 3.7, 3.8, 3.14a-b, 3.22, 3.10, 3.63 (copy of utilities.R), due Tuesday 1/20

  • [activity] exploring limits [R script]

Wednesday: finding stationary distributions [notes] [example]

  • [reading] 3.3, 3.2
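A stationary distribution \(\pi\) solves \(\pi P = \pi\) with entries summing to one. One common numerical approach, sketched here with the same kind of made-up matrix (not a course example), rewrites the system as \((P^\top - I)\pi = 0\) plus a normalization row:

```python
import numpy as np

# Hypothetical two-state transition matrix.
P = np.array([[0.9, 0.1],
              [0.5, 0.5]])
n = P.shape[0]

# Stack the balance equations (P^T - I) pi = 0 with the constraint sum(pi) = 1,
# then solve the (consistent) overdetermined system by least squares.
A = np.vstack([P.T - np.eye(n), np.ones(n)])
b = np.concatenate([np.zeros(n), [1.0]])
pi, *_ = np.linalg.lstsq(A, b, rcond=None)
print(pi.round(4))  # [0.8333 0.1667] for this chain
```

For this chain the balance equation \(0.1\,\pi_1 = 0.5\,\pi_2\) gives \(\pi = (5/6,\, 1/6)\) directly, which the numerical solution matches.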

Week 3 (1/20)

No class Monday (MLK Day observed); Tuesday follows the Monday schedule

Tuesday: recurrence, transience, and periodicity [notes]

  • [reading] 3.3, 3.5
  • [hw3] exercises 3.23, 3.29, 3.52, 3.54, 3.66; optionally, 3.64

Wednesday: limit theorem for finite Markov chains [notes]

  • [reading] 3.6, 3.8, 3.10
  • [activity] random walks on \(\mathbb{Z}^d\) [R script]
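The theme of this activity — simple random walk on \(\mathbb{Z}^d\) is recurrent for \(d \le 2\) and transient for \(d \ge 3\) (Pólya's theorem) — can be explored by simulation. A rough sketch (the walk length and trial counts are arbitrary, and this is not the course's R script):

```python
import random

def returned_by(n_steps, d, rng):
    """Does a simple random walk on Z^d return to the origin within n_steps?"""
    pos = [0] * d
    for _ in range(n_steps):
        axis = rng.randrange(d)          # pick a coordinate direction
        pos[axis] += rng.choice((-1, 1)) # step +/- 1 along it
        if all(c == 0 for c in pos):
            return True
    return False

rng = random.Random(7)
freqs = {}
for d in (1, 3):
    freqs[d] = sum(returned_by(500, d, rng) for _ in range(2000)) / 2000
    print(d, round(freqs[d], 2))
```

In dimension 1 nearly every walk returns quickly, while in dimension 3 the empirical return frequency stays well below one (the exact return probability in \(d = 3\) is about 0.34), consistent with transience.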

Week 4 (1/26)

Monday: estimation; hidden Markov models