Friday, February 02, 2007

The Power of the Bayesian Method: when n < k is not a problem

I have written here about how priors in Bayesian methods can be used as additional data (data augmentation). This is very useful when you have a model with negative degrees of freedom. Normally we say such a model cannot be identified because the number of observations n is smaller than the number of estimated parameters k. Traditionally, we had to accept that this problem is unsolvable. But with the aid of priors, we effectively add k prior "observations", one per parameter, so the degrees of freedom become n + k - k = n > 0 and the model is identifiable. I have mentioned bayesglm here, which was developed for exactly this problem; bayesglm is now in the R package arm.
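A minimal sketch of this idea in R (the simulated data and variable names are mine, purely for illustration): with n = 10 observations and k = 20 predictors an ordinary glm() cannot identify the model, but bayesglm() places weakly informative priors on the coefficients, which act like extra pseudo-observations.

# Illustrative example: more predictors (k = 20) than observations (n = 10).
library(arm)

set.seed(1)
n <- 10
k <- 20
X <- matrix(rnorm(n * k), n, k)
y <- rbinom(n, 1, 0.5)
dat <- data.frame(y, X)

# glm() would leave coefficients NA because n < k.
# bayesglm()'s default priors augment the data, so all k coefficients
# (plus the intercept) are estimable.
m <- bayesglm(y ~ ., data = dat, family = binomial(link = "logit"))
display(m)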

When is this a problem? It is a potential problem whenever we deal with a fixed-effect model. Traditionally, the fixed effects are handled by adding dummies to the model. If we have a small n but many individual units a (e.g., countries), we run into the problem that n - (k + a) < 0. If we have priors for the (k + a) parameters, we can identify the model, since n + (k + a) - (k + a) = n > 0. A sketch of this case follows below.
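Here is a hedged sketch of the fixed-effect case; the setup (one observation per country, variable names country, x, y) is invented for illustration and is not from any real data set.

# Fixed-effect example: a dummy for each of a = 25 countries, n = 25 observations,
# so the intercept + x + 24 country dummies exceed n.
library(arm)

set.seed(2)
a <- 25
n <- 25
country <- factor(1:a)          # one observation per country
x <- rnorm(n)
y <- 1 + 0.5 * x + rnorm(n)

# lm()/glm() would return NA for at least one coefficient here, since the
# country dummies alone already exhaust the degrees of freedom.
# bayesglm() puts an independent prior on every coefficient, which acts
# like extra pseudo-observations and keeps all k + a parameters estimable.
fe <- bayesglm(y ~ x + country, family = gaussian())
display(fe)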
