Abstract
Expert systems routinely use conditional reasoning. Conditionally specified statistical models offer several advantages over joint models; one is that Gibbs sampling can be used to generate realizations of the model. As a result, full conditional specification for multiple imputation is gaining popularity because it is flexible and computationally straightforward. However, it would be restrictive to require that every regression or classification involve all of the variables. Feature selection often removes some variables from the set of predictors, making the regression local. A Gibbs sampler that mixes full and local conditionals is referred to as a partially collapsed Gibbs sampler; it often converges faster because of the reduced conditioning. Its implementation, however, requires choosing a correct scan order: an invalid scan order yields an incorrect transition kernel and hence the wrong stationary distribution. We prove a necessary and sufficient condition for Gibbs sampling to correctly sample the joint distribution and propose an algorithm that identifies all of the valid scan orders for a given conditional model. A forward search algorithm is discussed. Checking compatibility among conditionals of different localities is also discussed.
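The abstract's central object, a Gibbs scan that mixes full conditionals with local (collapsed) conditionals and is therefore valid only under certain scan orders, can be illustrated on a toy model. The sketch below is not the paper's algorithm; it assumes a simple Gaussian chain X1 ~ N(0,1), X2 | X1 ~ N(X1,1), X3 | X2 ~ N(X2,1), for which every conditional used in the scan is available in closed form.

```python
import numpy as np

# Minimal sketch (assumed toy example, not the authors' method): a partially
# collapsed Gibbs sampler for the Gaussian chain
#   X1 ~ N(0,1),  X2 | X1 ~ N(X1,1),  X3 | X2 ~ N(X2,1).
# The scan mixes a "local" conditional for X1 (with X2 integrated out) with
# full/reduced conditionals for X2 and X3, in a scan order that preserves
# the joint distribution.

rng = np.random.default_rng(0)

def pcg_scan(x1, x2, x3):
    # 1) local conditional p(x1 | x3), X2 collapsed out: N(x3/3, 2/3)
    x1 = rng.normal(x3 / 3.0, np.sqrt(2.0 / 3.0))
    # 2) full conditional p(x2 | x1, x3): N((x1 + x3)/2, 1/2),
    #    refreshing the collapsed variable before it is used again
    x2 = rng.normal((x1 + x3) / 2.0, np.sqrt(0.5))
    # 3) p(x3 | x2) = N(x2, 1), which equals the full conditional
    #    because X3 is independent of X1 given X2
    x3 = rng.normal(x2, 1.0)
    return x1, x2, x3

n_iter, burn = 20000, 1000
x1 = x2 = x3 = 0.0
draws = np.empty((n_iter, 3))
for t in range(n_iter):
    x1, x2, x3 = pcg_scan(x1, x2, x3)
    draws[t] = (x1, x2, x3)
draws = draws[burn:]

# Sanity check against the true joint:
# Var(X1)=1, Var(X2)=2, Var(X3)=3, Corr(X1, X3) = 1/sqrt(3) ~ 0.577
print(draws.var(axis=0))
print(np.corrcoef(draws[:, 0], draws[:, 2])[0, 1])
```

The order of the steps is what matters: the collapsed draw of x1 comes first, and x2 is redrawn from its full conditional before it is conditioned on again. Permuting the steps so that the integrated-out variable is used as a conditioning value before being refreshed will in general change the transition kernel and hence the stationary distribution, which is the scan-order issue the paper addresses.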
Original language | English (US) |
---|---|
Pages (from-to) | 171-180 |
Number of pages | 10 |
Journal | Journal of Multivariate Analysis |
Volume | 167 |
DOIs | |
State | Published - Sep 2018 |
All Science Journal Classification (ASJC) codes
- Statistics and Probability
- Numerical Analysis
- Statistics, Probability and Uncertainty
Keywords
- Dependence network
- Faster convergence
- Multiple imputation
- Non-full conditional specification
- Partially collapsed Gibbs sampler
- Valid scan order