Monday, September 25, 2006

Bayesian GLM

I have recently been working with Professor Andrew Gelman, Aleks Jakulin, and Maria Grazia Pittau on a paper -- a default prior distribution for logistic and other regression models. The R function implementing the model is bayesglm.R.

Basically, the idea is this: when we fit a logistic model and one of the predictors happens to separate the outcome perfectly, maximum likelihood estimation fails to converge. Our function fits the model anyway by placing a default prior distribution on the coefficients. A short usage sketch is below, followed by the abstract of the paper.
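Here is a minimal sketch of how the function might be called on data with complete separation. The simulated data, the source("bayesglm.R") line, and the argument names prior.scale and prior.df are illustrative assumptions for this post, not necessarily the exact interface of the released code.

    # Simulated data with separation: whenever x = 1, y = 1
    set.seed(1)
    x <- c(rep(0, 20), rep(1, 20))
    y <- c(rbinom(20, 1, 0.3), rep(1, 20))
    dat <- data.frame(y = y, x = x)

    # Classical glm: the coefficient on x diverges and R warns about
    # fitted probabilities of 0 or 1
    fit.mle <- glm(y ~ x, family = binomial(link = "logit"), data = dat)

    # bayesglm with the default Cauchy prior (center 0, scale 2.5) still
    # returns finite, stable estimates; prior.df = 1 corresponds to Cauchy
    source("bayesglm.R")
    fit.bayes <- bayesglm(y ~ x, family = binomial(link = "logit"),
                          data = dat, prior.scale = 2.5, prior.df = 1)
    summary(fit.bayes)

The point of the sketch is only the contrast: the classical fit blows up under separation, while the prior keeps the coefficient estimate finite.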

We propose a new prior distribution for classical (non-hierarchical) logistic regression models, constructed by first scaling all nonbinary variables to have mean 0 and standard deviation 0.5, and then placing independent Student-t prior distributions on the coefficients. As a default choice, we recommend the Cauchy distribution with center 0 and scale 2.5, which in the simplest setting is a longer-tailed version of the distribution attained by assuming one-half additional success and one-half additional failure in a logistic regression. We implement a procedure to fit generalized linear models in R with this prior distribution by incorporating an approximate EM algorithm into the usual iteratively weighted least squares algorithm. We illustrate with several examples, including a series of logistic regressions predicting voting preferences and an imputation model for a public health dataset.
We recommend this default prior distribution for routine applied use. It has the advantage of always giving answers, even when there is complete separation in logistic regression (a common problem, even when the sample size is large and the number of predictors is small) and also automatically applying more shrinkage to higher-order interactions. This can be useful in routine data analysis as well as in automated procedures such as chained equations for missing-data imputation.
Keywords: Bayesian inference, generalized linear models, least squares, linear regression, logistic regression, noninformative prior distribution
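As an illustration of the rescaling step mentioned in the abstract, a nonbinary input can be centered and scaled to have standard deviation 0.5 before fitting; x here is just a hypothetical continuous predictor:

    # Shift to mean 0 and scale to sd 0.5, so a Cauchy(0, 2.5) prior on the
    # coefficient is weakly informative on the logistic scale
    x.scaled <- (x - mean(x)) / (2 * sd(x))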
