
Bayesian


How Bayesian inference works

The joy and martyrdom of trying to be a Bayesian

Some of my fellow scientists have it easy. They use predefined methods like linear regression and ANOVA to test simple hypotheses; they live in the innocent world of bivariate plots and lm(). Sometimes they notice that the data have odd histograms, and they use glm(). The more educated ones use generalized linear mixed-effect models. A complete workflow, from the initial data massage to model fitting and the output of the results, can easily fit on one page of an R script. Even if more is needed, it rarely takes more than a second to run.
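To make that concrete, here is a hedged sketch of what such a one-page workflow might look like; the data frame and every variable in it are made up for illustration.

# hypothetical data: a continuous response y, a covariate x1, a factor x2
d <- data.frame(y = rnorm(100), x1 = runif(100), x2 = gl(2, 50))

# simple hypotheses: linear regression / ANOVA via lm()
m1 <- lm(y ~ x1 + x2, data = d)
summary(m1)

# odd histogram (e.g. counts)? switch to glm()
d$counts <- rpois(100, lambda = exp(1 + d$x1))
m2 <- glm(counts ~ x1 + x2, data = d, family = poisson)
summary(m2)

# the more educated option, a generalized linear mixed-effect model
# (lme4::glmer is one common route, assuming a grouping variable exists):
# library(lme4)
# m3 <- glmer(counts ~ x1 + (1 | group), data = d, family = poisson)

Each of these fits runs in well under a second, which is exactly the point.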

Fitting psychometric functions using STAN

Intro to Approximate Bayesian Computation (ABC)

Many of the posts in this blog have been concerned with using MCMC-based methods for Bayesian inference. These methods are typically “exact” in the sense that they have the exact posterior distribution of interest as their target equilibrium distribution, but are obviously “approximate”, in that for any finite amount of computing time, we can only generate a finite sample of correlated realisations from a Markov chain that we hope is close to equilibrium. Approximate Bayesian Computation (ABC) methods go a step further, and generate samples from a distribution which is not the true posterior distribution of interest, but a distribution which is hoped to be close to the real posterior distribution of interest. There are many variants on ABC, and I won’t get around to explaining all of them in this blog. The Wikipedia page on ABC is a good starting point for further reading.

The basic idea: the model is described by a prior π(θ), and a forwards-simulation model for the data x, defined by π(x|θ).
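As a concrete illustration, here is a minimal rejection-ABC sketch in R. The Normal model, the vague prior, the summary statistic, and the tolerance are all hypothetical choices for the example, not anything prescribed by the post: draw θ from the prior, forwards-simulate data, and keep the draws whose simulated summary lands close to the observed one.

# observed data: assume 50 draws from a Normal(mu, 1) with unknown mu
x_obs <- rnorm(50, mean = 2, sd = 1)

# sample candidate values of mu from a (hypothetical) vague prior pi(theta)
n_sim <- 100000
theta <- rnorm(n_sim, mean = 0, sd = 10)

# forwards-simulate a data set for each candidate and record a summary
sim_means <- sapply(theta, function(t) mean(rnorm(50, mean = t, sd = 1)))

# rejection step: keep draws whose summary is within eps of the observed one
eps <- 0.1
post_sample <- theta[abs(sim_means - mean(x_obs)) < eps]

hist(post_sample)  # a sample from an approximation to the posterior for mu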

Posteriors vs predictives

Bayesian Probability Is Not Subjective (It Only Seems Like It Is)

Is this George? Logic: start with standard logic; think syllogisms.

Frequentists Are Closet Bayesians: Confidence Interval Edition

Actually, all frequentists and Bayesians are logical probabilists, but if I put that in the title, few would believe it. A man might call himself an Anti-Gravitational Theorist, a label describing the belief that gravity is subject to human will. But the very moment that man takes a long walk off a short dock, he’s going to get wet. One can say one believes anything, but in the end we are all Reality Theorists: we’re stuck with what actually is. The official definition: for some uncertain proposition, a parameterized probability model is proposed. Nobody ever believes, nor should they believe, the guess.

Musicians, drunks, and Oliver Cromwell

Jim Berger gives the following example illustrating the difference between frequentist and Bayesian approaches to inference in his book The Likelihood Principle.

Bayes’ Theorem Using Trimmed Trees

Convenient and innocuous priors

Andrew Gelman has some interesting comments on non-informative priors this morning: “Rather than thinking of the prior as a static thing, think of it as a way to prime the pump. … a non-informative prior is a placeholder: you can use the non-informative prior to get the analysis started, then if your posterior distribution is less informative than you would like, or if it does not make sense, you can go back and add prior information. … At first this may sound like tweaking your analysis until you get the conclusion you want.”
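As a toy illustration of that pump-priming workflow, here is a conjugate beta-binomial example in R; the data and both priors are invented for the example.

# 3 successes in 10 trials
s <- 3; n <- 10

# start with a flat Beta(1, 1) placeholder prior:
# the posterior is Beta(1 + s, 1 + n - s)
curve(dbeta(x, 1 + s, 1 + n - s), from = 0, to = 1,
      xlab = "probability of success", ylab = "density")

# if that posterior is less informative than you would like, go back and
# encode genuine prior information, e.g. a Beta(10, 10) centred on 0.5
curve(dbeta(x, 10 + s, 10 + n - s), add = TRUE, lty = 2)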

Bayesian First Aid

So I have a secret project. Come closer. I’m developing an R package that implements Bayesian alternatives to the most commonly used statistical tests. Yes, you heard me: soon your t.testing days might be over! The package aims at being as easy as possible to pick up and use, especially if you are already used to the classical .test functions.
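The promise is that the Bayesian version is a drop-in replacement for the classical call. A sketch of the intended usage, on R’s built-in sleep data; the GitHub install line is my assumption of the usual devtools route:

# the classical test
x <- sleep$extra[sleep$group == 1]
y <- sleep$extra[sleep$group == 2]
t.test(x, y)

# the Bayesian First Aid version: prepend bayes. to the call
# devtools::install_github("rasmusab/bayesian_first_aid")  # assumed repo
library(BayesianFirstAid)
bayes.t.test(x, y)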

Short taxonomy of Bayes factors

I started to familiarize myself with Bayesian statistics.

WinBUGS/JAGS

Writing a Metropolis-Hastings within Gibbs sampler in R for a 2PL IRT model

Last year, Brian Junker, Richard Patz, and I wrote an invited chapter for the (soon to be released) update of the classic text Handbook of Modern Item Response Theory (1996). The chapter itself is meant to be an update of the classic IRT-in-MCMC papers Patz & Junker (1999a, 1999b). To support the chapter, I have put together an online supplement which gives a detailed walk-through of how to write a Metropolis-Hastings sampler for a simple psychometric model (in R, of course!). I will continue to add to the online supplement over time.
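To give a flavour of what such a walk-through covers, here is a heavily stripped-down sketch of the Metropolis-Hastings-within-Gibbs update for the person abilities in a 2PL model. The simulated data, the N(0, 1) prior, and the fixed tuning constant are placeholders of mine, not the supplement’s actual code:

# simulate placeholder 2PL data: n.persons x n.items matrix of 0/1 responses
n.persons <- 200; n.items <- 10
theta_true <- rnorm(n.persons)                   # abilities
a <- runif(n.items, 0.5, 2); b <- rnorm(n.items) # discriminations, difficulties
p <- plogis(outer(theta_true, b, "-") * rep(a, each = n.persons))
y <- matrix(rbinom(n.persons * n.items, 1, p), n.persons, n.items)

# log-posterior of one ability given the item parameters, with a N(0, 1) prior
log_post_theta <- function(th, y_i, a, b) {
  p <- plogis(a * (th - b))
  sum(dbinom(y_i, 1, p, log = TRUE)) + dnorm(th, log = TRUE)
}

# one Metropolis-Hastings sweep over the person parameters
theta_cur <- rep(0, n.persons)
tune <- 1   # proposal SD; adaptive tuning is the subject of Post 12 below
for (i in 1:n.persons) {
  prop <- rnorm(1, theta_cur[i], tune)           # symmetric random-walk proposal
  log_r <- log_post_theta(prop, y[i, ], a, b) -
           log_post_theta(theta_cur[i], y[i, ], a, b)
  if (log(runif(1)) < log_r) theta_cur[i] <- prop
}
# a full sampler alternates this with analogous updates for a and b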

The next few posts will be:

Post 10: Over-dispersion and multi-core parallelism
Post 11: Replacing R with C
Post 12: Adaptive tuning of the Metropolis-Hastings proposals

I would be grateful for any feedback.

Stan - ShinyStan

Analysis & visualization GUI for MCMC. ShinyStan provides visual and numerical summaries of model parameters and convergence diagnostics for MCMC simulations.

ShinyStan can be used to explore the output of any MCMC program (including but not limited to Stan, JAGS, BUGS, MCMCPack, NIMBLE, emcee, and SAS). ShinyStan is coded in R using the Shiny web application framework (RStudio). Instructions for installing ShinyStan are available for all platforms.
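A minimal usage sketch in R; the fabricated draws below stand in for real MCMC output (as.shinystan also accepts stanfit objects, mcmc.lists, and 3-D arrays):

library(shinystan)

# pretend posterior draws: 2 chains, 1000 iterations, 2 named parameters
draws <- lapply(1:2, function(chain)
  cbind(alpha = rnorm(1000), beta = rnorm(1000)))

sso <- as.shinystan(draws)  # a list of chain matrices is one accepted format
launch_shinystan(sso)       # opens the GUI in the browser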

Subjectivity in statistics

Andrew Gelman on subjectivity in statistics: “Bayesian methods are often characterized as ‘subjective’ because the user must choose a prior distribution, that is, a mathematical expression of prior information. The prior distribution requires information and user input, that’s for sure, but I don’t see this as being any more ‘subjective’ than other aspects of a statistical procedure, such as the choice of model for the data (for example, logistic regression) or the choice of which variables to include in a prediction, the choice of which coefficients should vary over time or across situations, the choice of statistical test, and so forth.”