Abstract
Consider the problem of prediction in a change point regression model. That is, assume a simple linear regression model holds for all x (the independent variable) less than γk, and a different simple linear regression model holds for x > γk. However, γk, the change point, is unknown but can be one of m possible values (γ1 ≤ γ2 ≤ ⋯ ≤ γm). For x = x0 > γm we want to predict a future value of the dependent variable. Adaptive predictors (estimators) and unbiased predictors (estimators) are studied. The adaptive estimator is one which estimates by least squares under the kth model, provided the residual sum of squares for the kth model is smallest. Some theoretical justification for the adaptive estimator is given through a decision theory formulation. A small simulation study comparing the adaptive estimator and the unbiased estimator is also offered.
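To make the selection rule concrete, the following is a minimal sketch of the adaptive predictor described in the abstract, assuming NumPy; the function name `adaptive_predict`, its signature, and the handling of candidates with too few observations are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def adaptive_predict(x, y, gammas, x0):
    """Hypothetical sketch of the adaptive change-point predictor.

    For each candidate change point gamma_k, fit separate simple linear
    regressions to the observations with x <= gamma_k and x > gamma_k,
    record the total residual sum of squares (RSS), and predict y at x0
    from the right-hand fit of the candidate with the smallest RSS.
    """
    def ols(xs, ys):
        # Simple linear regression; returns coefficients (intercept, slope) and RSS.
        X = np.column_stack([np.ones(len(xs)), xs])
        beta, *_ = np.linalg.lstsq(X, ys, rcond=None)
        rss = np.sum((ys - X @ beta) ** 2)
        return beta, rss

    best_rss, best_beta_right = np.inf, None
    for g in gammas:
        left, right = x <= g, x > g
        if left.sum() < 2 or right.sum() < 2:
            continue  # need at least two points per segment to fit a line
        _, rss_left = ols(x[left], y[left])
        beta_right, rss_right = ols(x[right], y[right])
        if rss_left + rss_right < best_rss:
            best_rss, best_beta_right = rss_left + rss_right, beta_right

    # Since x0 exceeds the largest candidate change point, the prediction
    # comes from the right-hand regression of the selected model.
    return best_beta_right[0] + best_beta_right[1] * x0
```

Because x0 > γm, the future observation falls in the right-hand regime under every candidate model, so the adaptive rule only needs the right-hand segment of whichever model achieves the smallest residual sum of squares.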
Original language | English (US) |
---|---|
Pages (from-to) | 131-138 |
Number of pages | 8 |
Journal | Statistics and Probability Letters |
Volume | 20 |
Issue number | 2 |
DOIs | |
State | Published - May 27 1994 |
All Science Journal Classification (ASJC) codes
- Statistics and Probability
- Statistics, Probability and Uncertainty
Keywords
- Bias
- Generalized Bayes estimators
- Mean squared error
- Penalty factor
- Uninformative prior distributions