I'm in a statistics class this semester. On the first day, the professor gave an argument for Bayesianism: frequentist probabilities are only defined when events can be viewed as members of a series of experiments with a limiting frequency, but we also sometimes want to talk about the probability of events which can't be framed in that way. For example, Obama being re-elected is a singular event, so we would have difficulty framing it as one of a sequence of experiments. Bayesianism extends the notion of probability to such cases.
Since then, of course, I've been bringing up a few Bayesian points in class when relevant. On his end, the prof goes so far as to point out that Bayes' Law is not strictly necessary whenever he uses it, and to work through the problem a second time without it.
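To illustrate the prof's point, here is a small sketch (the scenario and numbers are made up, not from class) of the same conditional-probability question answered both ways: once with Bayes' Law, and once by building the joint distribution directly and reading the conditional off as a ratio of masses.

```python
# Hypothetical diagnostic-test example (all numbers invented):
# compute P(disease | positive test) two ways.

p_disease = 0.01        # prior probability of disease
p_pos_given_d = 0.95    # P(positive | disease)
p_pos_given_nd = 0.05   # P(positive | no disease)

# 1. Via Bayes' Law: P(D|+) = P(+|D) P(D) / P(+)
p_pos = p_pos_given_d * p_disease + p_pos_given_nd * (1 - p_disease)
via_bayes = p_pos_given_d * p_disease / p_pos

# 2. Avoiding Bayes' Law: tabulate the joint distribution and take
#    the ratio of joint mass to marginal mass on "positive".
joint = {
    ("disease", "pos"): p_disease * p_pos_given_d,
    ("disease", "neg"): p_disease * (1 - p_pos_given_d),
    ("healthy", "pos"): (1 - p_disease) * p_pos_given_nd,
    ("healthy", "neg"): (1 - p_disease) * (1 - p_pos_given_nd),
}
via_joint = joint[("disease", "pos")] / sum(
    mass for (status, test), mass in joint.items() if test == "pos"
)

assert abs(via_bayes - via_joint) < 1e-12
print(via_bayes)
```

Both routes give the same number, which is the point: Bayes' Law is a convenience for when you have the conditionals in one direction, not a separate axiom.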
This makes me want to write about a few things.
- Higher Moments
- The square and the second moment
- Least-square fitting
- p-value testing and the sort of refutation which Bayesianism is capable of
- covariance vs mutual information, variance vs entropy
- Information theory (& coding theory) as a foundation for subjective probability
However, I have too little time to write about these things. :p If I take the time later, I will turn these bullet points into links.