Kalman and Bayes average grades
This post will look at the problem of updating an average grade as a very simple special case of Bayesian statistics and of Kalman filtering.
Suppose you're keeping up with your average grade in a class, and you know your average after n tests, all weighted equally.
m = (x_1 + x_2 + x_3 + ... + x_n) / n.
Then you get another test grade back and your new average is
m' = (x_1 + x_2 + x_3 + ... + x_n + x_{n+1}) / (n + 1).
You don't need the individual test grades once you've computed the average; you can instead remember the average m and the number of grades n [1]. Then you know the sum of the first n grades is nm and so
m' = (nm + x_{n+1}) / (n + 1).
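This update can be carried out in constant memory. Here is a minimal sketch; the function name and the sample grades are illustrative, not from the post.

```python
def update_mean(m, n, x):
    """Fold a new value x into a running mean m of n values.

    Returns the updated mean and the updated count,
    using m' = (n*m + x) / (n + 1).
    """
    return (n * m + x) / (n + 1), n + 1

# Example: process grades one at a time, keeping only (m, n).
grades = [90, 80, 85, 95]
m, n = 0.0, 0
for x in grades:
    m, n = update_mean(m, n, x)
# m now equals sum(grades) / len(grades) = 87.5
```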
You could split that into
m' = w_1 m + w_2 x_{n+1}
where w_1 = n/(n + 1) and w_2 = 1/(n + 1). In other words, the new mean is the weighted average of the previous mean and the new score.
A Bayesian perspective would say that your posterior expected grade m' is a compromise between your prior expected grade m and the new data x_{n+1}. [2]
You could also rewrite the equation above as
m' = m + (x_{n+1} - m)/(n + 1) = m + KΔ
where K = 1/(n + 1) and Δ = x_{n+1} - m. In Kalman filter terms, K is the gain: the proportionality constant relating the change in your state to the difference between what you saw and what you expected.
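The three forms above are algebraically the same update, which a quick numerical check confirms. The numbers here are illustrative, not from the post.

```python
m, n, x = 87.5, 4, 70  # prior mean, number of grades so far, new grade

direct   = (n * m + x) / (n + 1)              # from the running sum
weighted = (n/(n + 1)) * m + (1/(n + 1)) * x  # weighted average of prior mean and new score
K = 1 / (n + 1)                               # Kalman gain
gain     = m + K * (x - m)                    # prior + gain * (observed - expected)

assert abs(direct - weighted) < 1e-12
assert abs(direct - gain) < 1e-12
```

Note that the gain K = 1/(n + 1) shrinks as n grows: the more evidence behind the prior mean, the less a single new observation moves it.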
Related posts

- A Bayesian view of Amazon Resellers
- Kalman filters and functional programming
- Kalman filters and bottom-up learning
[1] In statistical terms, the mean is a sufficient statistic.
[2] You could flesh this out by using a normal likelihood and a flat improper prior.
The post Kalman and Bayes average grades first appeared on John D. Cook.