## Abstract

Consider the problem of prediction in a change point regression model. That is, assume one simple linear regression model holds for all x (the independent variable) less than γ_{k}, and a different simple linear regression model holds for x > γ_{k}. The change point γ_{k} is unknown but is one of m possible values (γ_{1} ≤ γ_{2} ≤ ⋯ ≤ γ_{m}). For x = x_{0} > γ_{m} we want to predict a future value of the dependent variable. Adaptive predictors (estimators) and unbiased predictors (estimators) are studied. The adaptive estimator estimates by least squares under the kth model whenever the kth model yields the smallest residual sum of squares. Some theoretical justification for the adaptive estimator is given through a decision theory formulation. A small simulation study comparing the adaptive estimator and the unbiased estimator is also offered.
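The selection rule described above can be sketched as follows. This is a minimal illustration, not the authors' implementation: for each candidate change point γ_k, two separate least-squares lines are fitted (for x ≤ γ_k and x > γ_k), the candidate with the smallest total residual sum of squares is selected, and the right-hand line predicts at x_0 > γ_m. The function names and the toy data are hypothetical.

```python
import numpy as np

def segment_fit(x, y):
    """Ordinary least squares for a single line y = b0 + b1*x; returns (coefficients, RSS)."""
    X = np.column_stack([np.ones_like(x), x])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ beta
    return beta, float(resid @ resid)

def adaptive_predict(x, y, gammas, x0):
    """Adaptive estimator sketch: among candidate change points, pick the
    two-piece model with smallest total RSS, then predict at x0 (> gamma_m)
    using the fitted line for the right-hand segment."""
    best_rss, best_beta = np.inf, None
    for g in gammas:
        left, right = x <= g, x > g
        _, rss_left = segment_fit(x[left], y[left])
        beta_right, rss_right = segment_fit(x[right], y[right])
        if rss_left + rss_right < best_rss:
            best_rss, best_beta = rss_left + rss_right, beta_right
    return best_beta[0] + best_beta[1] * x0
```

With noiseless data whose true break is at one of the candidates, the rule recovers that candidate exactly, since its two-piece fit has zero residual sum of squares.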

| Original language | English (US) |
| --- | --- |
| Pages (from-to) | 131-138 |
| Number of pages | 8 |
| Journal | Statistics and Probability Letters |
| Volume | 20 |
| Issue number | 2 |
| State | Published - May 27 1994 |

## All Science Journal Classification (ASJC) codes

- Statistics and Probability
- Statistics, Probability and Uncertainty

## Keywords

- Bias
- Generalized Bayes estimators
- Mean squared error
- Penalty factor
- Uninformative prior distributions