Estimation theory is a key enabler in many of today's applications. Estimation can be performed in a classical or in a Bayesian framework. In classical estimation, the parameter vector to be estimated is considered deterministic. In contrast, Bayesian estimators consider the parameter vector to be random, which allows prior knowledge to be included in the form of statistics of the parameter vector.
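To make this distinction concrete (in notation chosen here for illustration only): for a parameter vector \mathbf{x} and measurements \mathbf{y}, a classical estimator treats \mathbf{x} as an unknown deterministic quantity, so that only the statistics of \mathbf{y} enter the design, whereas the Bayesian minimum mean square error (MMSE) estimator models \mathbf{x} as random with prior probability density function p(\mathbf{x}) and minimizes the Bayesian mean square error,
\[
\hat{\mathbf{x}}_{\mathrm{MMSE}} = \mathrm{E}[\mathbf{x}\,|\,\mathbf{y}] = \arg\min_{g} \, \mathrm{E}_{\mathbf{x},\mathbf{y}}\big[\|g(\mathbf{y}) - \mathbf{x}\|^2\big],
\]
so that the prior statistics of \mathbf{x} enter the solution directly.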
To achieve optimal or near-optimal performance, all available information about the underlying system model should be incorporated in the estimation process. In this work, we mainly use the so-called linear model with known statistics of the measurements. Beyond this model, however, additional model knowledge is available in many applications. We propose several knowledge-aided classical estimators that incorporate such additional model knowledge in an optimal way. These optimal knowledge-aided estimators are compared to estimators that incorporate the additional model knowledge in an intuitive manner. It will be shown that the proposed estimators significantly outperform the intuitive estimators as well as state-of-the-art estimators in many applications.
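As a point of reference, the linear model referred to above is commonly written as (a minimal sketch in illustrative notation; \mathbf{H} and \mathbf{C}_n are assumptions of this sketch, not necessarily the notation of the main text)
\[
\mathbf{y} = \mathbf{H}\mathbf{x} + \mathbf{n},
\]
where \mathbf{H} is a known observation matrix with full column rank and \mathbf{n} is zero-mean noise with known covariance matrix \mathbf{C}_n. For this model, the best linear unbiased estimator (BLUE) is
\[
\hat{\mathbf{x}}_{\mathrm{BLUE}} = \big(\mathbf{H}^H \mathbf{C}_n^{-1} \mathbf{H}\big)^{-1} \mathbf{H}^H \mathbf{C}_n^{-1} \mathbf{y};
\]
a knowledge-aided estimator in the above sense additionally exploits model knowledge beyond \mathbf{H} and \mathbf{C}_n.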
We also derive adaptive filters that incorporate such model knowledge in an optimal way. These knowledge-aided adaptive filters are compared with intuitively derived as well as state-of-the-art adaptive filters, and again a significant performance gain is achieved in many scenarios.
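For orientation, a standard adaptive filter of the kind such knowledge-aided variants are typically benchmarked against is the least mean squares (LMS) filter, whose update reads (an illustrative example only; the specific baselines considered in this work may differ)
\[
e_k = d_k - \mathbf{w}_k^H \mathbf{x}_k, \qquad \mathbf{w}_{k+1} = \mathbf{w}_k + \mu\, \mathbf{x}_k\, e_k^*,
\]
where \mathbf{w}_k denotes the filter coefficient vector at time index k, \mathbf{x}_k the input regressor, d_k the desired signal, and \mu the step size.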
Another difference between the classical and the Bayesian approach is the employed unbiasedness constraint. The unbiasedness constraint utilized by state-of-the-art Bayesian estimators is weaker than that of unbiased classical estimators. In this work, we investigate novel Bayesian estimators fulfilling the so-called component-wise conditionally unbiased (CWCU) constraints. The effects of these constraints, the relation of the resulting estimators to other Bayesian estimators, and their ability to incorporate statistics of the unknown parameter vector are discussed.
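In the illustrative notation used above, these constraints can be contrasted as follows. A classical estimator is unbiased if
\[
\mathrm{E}_{\mathbf{y}}[\hat{\mathbf{x}}] = \mathbf{x} \quad \text{for every possible } \mathbf{x},
\]
a Bayesian estimator is globally unbiased if
\[
\mathrm{E}_{\mathbf{x},\mathbf{y}}[\hat{\mathbf{x}}] = \mathrm{E}_{\mathbf{x}}[\mathbf{x}],
\]
and a CWCU estimator has to satisfy
\[
\mathrm{E}_{\mathbf{y}|x_i}[\hat{x}_i \,|\, x_i] = x_i \quad \text{for every } x_i \text{ and every component } i,
\]
which is stronger than global Bayesian unbiasedness but weaker than classical unbiasedness, since the estimator is conditioned on only one component of \mathbf{x} at a time.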