OMSCS debrief - part 1

ISYE 6420: Bayesian Statistics

For my first course in the OMSCS program, I decided to take Bayesian Statistics:

  • it had open “seats” (and getting into classes can be challenging—especially when you don’t have many credit hours);1
  • I’d self-studied Bayesian statistics a good amount, so I felt confident that I could do well in the class; and
  • from the OMSCS Central reviews, it didn’t seem like it would be an overwhelming first course (in terms of time or difficulty).2

Prerequisites

Programming. I don’t remember specific prerequisites when I took the class. At that point in time, the homework assignments (when programming was required) could be completed in R, Python, or even MATLAB. Certain assignments required using the Bayesian inference Using Gibbs Sampling (BUGS) DSL or one of its cousins (e.g., JAGS).

In other words, there were no hard requirements coming into the class.

Mathematical background. The mathematical prerequisites were light as well: calculus, a first course in probability, and a first course in (mathematical) statistics. I’d taken mathematical statistics as opposed to an undergraduate applied stats class, but either would be (probably approximately) sufficient.

Content

The course covered

  • Bayes’ theorem and conditional reasoning;
  • prior distributions and posterior distributions, exploiting conjugacy along the way;
  • computing using Bayesian inference toolboxes;
  • Markov chain Monte Carlo (MCMC) samplers from foundations (e.g., coding a Metropolis sampler and a Gibbs sampler from scratch);
  • hierarchical modeling;
  • Bayesian regression;
  • Bayesian approaches to handling missing data.
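To give a flavor of the “from scratch” MCMC work mentioned above, here is a minimal random-walk Metropolis sampler. This is my own illustrative sketch, not the course’s actual assignment code; the target density and step size are arbitrary choices for the example.

```python
import numpy as np

def metropolis(log_target, x0, n_samples, step=1.0, seed=0):
    """Random-walk Metropolis: propose x' ~ N(x, step^2) and accept
    with probability min(1, target(x') / target(x))."""
    rng = np.random.default_rng(seed)
    samples = np.empty(n_samples)
    x = x0
    for i in range(n_samples):
        proposal = x + step * rng.normal()
        # Compare in log space for numerical stability.
        if np.log(rng.uniform()) < log_target(proposal) - log_target(x):
            x = proposal
        samples[i] = x  # repeat the current state on rejection
    return samples

# Toy example: unnormalized log-density of a N(2, 1) target.
log_target = lambda x: -0.5 * (x - 2.0) ** 2
draws = metropolis(log_target, x0=0.0, n_samples=20_000, step=2.5)
burned = draws[5_000:]  # discard burn-in
print(burned.mean(), burned.std())  # should be roughly 2 and 1
```

The same accept/reject skeleton generalizes: a Gibbs sampler replaces the proposal step with exact draws from each full conditional, so every “proposal” is accepted.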

The class had more of an applied focus in that modeling was emphasized over theory (e.g., there wasn’t much concern for convergence rates of MCMC). And as someone who tends to like theory, I found this effective. This was a class about using Bayesian methods for real problems; there was sufficient theory for what the class covered, but it wasn’t a theory class.

Assignments

For fall 2019, the assignments consisted of six multi-problem homework sets, a midterm, a project, and a final. The midterm and final were “take-home” assignments with a one-week time limit. Homework assignments spanned two-week intervals. The project could be completed over multiple weeks. I found that all of the assignments were relatively balanced in terms of how much time and effort I spent on them.

OMSCS Central-like stats

Difficulty

Conceptual. Having background in the more theoretical side of statistics, I found the class straightforward. That said, Bayesian reasoning and inference can be weird when one first encounters it; I recall some questions and some grumbling about this during the early weeks in the class. Similarly, Bayesians integrate a lot so if the part of your brain that computes integrals hasn’t been exercised, then you might find yourself rebuilding those neural connections.
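On that integration point: conjugacy is what keeps many of those integrals tractable. As a standard worked example (my own illustration, not the course’s notation), take a Beta$(a, b)$ prior on $\theta$ and a Binomial$(n, \theta)$ likelihood with $y$ successes:

$$
p(\theta \mid y) \;\propto\; \underbrace{\theta^{y}(1-\theta)^{n-y}}_{\text{likelihood}}\;\underbrace{\theta^{a-1}(1-\theta)^{b-1}}_{\text{prior}} \;=\; \theta^{\,y+a-1}(1-\theta)^{\,n-y+b-1},
$$

so $\theta \mid y \sim \mathrm{Beta}(a+y,\, b+n-y)$ by inspection, and the normalizing constant is just the Beta function $B(a+y,\, b+n-y)$. No explicit integration required.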

Practical. The assignments required using some toolboxes (which may have changed since 2019) for Bayesian inference, none of which have difficult APIs (if memory serves). This is not a class where you’ll find yourself buried in loads of code that you or someone else wrote.

Workload

My experiencing-self found the workload quite manageable alongside a full-time job and a balanced life. (By that latter point, I mean that I had plenty of time for sleeping, spending time with family and friends, exercising, etc.) My remembering-self, who has the full scope of OMSCS in mind, concurs with my experiencing-self.

For me, this class could’ve easily been paired with another course.

Rating

Coming into OMSCS, I considered myself to be a computationally-inclined statistician more than a statistically-inclined computer scientist. So this class was up my alley and consistent with my interests. I enjoyed it and would recommend it to folks who

  • like stats,
  • want to learn how Bayesians think,
  • want to model things like Bayesians do, and/or
  • want more of a math course than a CS course.

Wrapping up

Bayesian thinking is fun! I think that anyone who reasons about data would do well to have a Bayesian lens at her/his/their disposal for analyzing and interpreting said data. (As an aside: appreciating the Bayesian branch of risk—i.e., the expectation of loss—is a must for statisticians and arguably for ML folks too.) And this class provides just such a lens!

If you have specific questions or comments about the course, please feel free to send me an email: jake@knigge.us.

Next up

… CS 7642: Reinforcement Learning and Decision Making—you can also bring your $\widehat{\theta}$ for that one!

(My goal is to complete the next recap within a month—not nine. 😳)


  1. When registering for classes, students with more credit hours are given higher priority (aka, earlier registration times) than students with fewer hours. So, in some sense, you take what you can get. ↩︎

  2. The reviews on OMSCS Central can be helpful in getting a sense for what your experience in a course could be like. There are helpful reviews, ranting reviews, terse reviews, and thorough reviews. ↩︎
