Abstract
In this paper, we consider the convergence properties of the forgetting factor RLS algorithm in a stationary data environment. More precisely, we study how the speed of convergence of RLS depends on the initialization of the input sample covariance matrix and on the observation noise level. By obtaining estimates of the settling time, that is, the time RLS requires to converge, we conclude that the algorithm can exhibit variable performance. Specifically, when the observation noise level is low (high-SNR environment), RLS initialized with a matrix of small norm converges very fast, and the convergence speed decreases as the norm of the initialization matrix increases. In a medium-SNR environment, the optimum convergence speed of the algorithm is lower than in the previous case, but RLS becomes less sensitive to initialization. Finally, in a low-SNR environment, we show that it is preferable to start the algorithm with a matrix of large norm.
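The setting discussed above can be illustrated with the standard exponentially weighted RLS recursion, where the sample covariance matrix is initialized as δI, so that the norm of the initialization is controlled by δ. This is a minimal sketch under that standard convention; the function name, parameter choices, and the FIR identification scenario are illustrative and not taken from the paper.

```python
import numpy as np

def ff_rls(x, d, order, lam=0.99, delta=1e-2):
    """Forgetting-factor RLS identifying an FIR filter of length `order`.

    The input sample covariance matrix is initialized as R(0) = delta * I,
    so its inverse starts at P(0) = I / delta; a small `delta` corresponds
    to an initialization matrix of small norm.
    """
    w = np.zeros(order)              # weight estimate
    P = np.eye(order) / delta        # inverse sample covariance matrix
    u = np.zeros(order)              # regressor of the most recent inputs
    for n in range(len(x)):
        u = np.roll(u, 1)
        u[0] = x[n]
        k = P @ u / (lam + u @ P @ u)        # gain vector
        e = d[n] - w @ u                     # a priori estimation error
        w = w + k * e                        # weight update
        P = (P - np.outer(k, u @ P)) / lam   # Riccati update of the inverse
    return w
```

In a high-SNR experiment (very low observation noise), running this sketch with a small `delta` recovers the unknown filter within a short settling time, matching the regime where the abstract predicts fast convergence for a small-norm initialization.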
Original language | English (US)
---|---
Pages (from-to) | 2370
Number of pages | 1
Journal | IEEE Transactions on Signal Processing
Volume | 44
Issue number | 9
State | Published - 1996
Externally published | Yes
All Science Journal Classification (ASJC) codes
- Signal Processing
- Electrical and Electronic Engineering