Thomas Paireder, Christian Motz, Mario Huemer,
"Normalized stochastic gradient descent learning of general complex-valued models",
in IET Electronics Letters, Vol. 57, No. 12, Wiley, pp. 493-495, June 2021, ISSN: 1350-911X
Original title:
Normalized stochastic gradient descent learning of general complex-valued models
Language of the title:
English
Original abstract:
The stochastic gradient descent (SGD) method is one of the most prominent first-order iterative optimisation algorithms, enabling linear adaptive filters as well as general nonlinear learning schemes. It is applicable to a wide range of objective functions, while featuring low computational costs for online operation. However, without a suitable step-size normalisation, the convergence and tracking behaviour of the stochastic gradient descent method might be degraded in practical applications. In this letter, a novel general normalisation approach is provided for the learning of (non-)holomorphic models with multiple independent parameter sets. The advantages of the proposed method are demonstrated by means of a specific widely-linear estimation example.
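To illustrate the setting the abstract describes, the following is a minimal sketch of normalized SGD learning for a widely-linear complex-valued model with two independent parameter sets (a linear part `h` and a conjugate-linear part `g`). This is a standard complex NLMS-style per-branch normalisation written for illustration, not the letter's exact normalisation scheme; the system `h_true`, `g_true`, the step size `mu`, and the regularisation `eps` are all hypothetical choices.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical widely-linear system to identify: d = h_true @ x + g_true @ conj(x)
N = 4
h_true = rng.standard_normal(N) + 1j * rng.standard_normal(N)
g_true = rng.standard_normal(N) + 1j * rng.standard_normal(N)

h = np.zeros(N, dtype=complex)   # estimate of the linear parameter set
g = np.zeros(N, dtype=complex)   # estimate of the conjugate-linear parameter set
mu, eps = 0.5, 1e-8              # step size and regularisation (illustrative values)

for _ in range(2000):
    x = rng.standard_normal(N) + 1j * rng.standard_normal(N)  # proper complex input
    d = h_true @ x + g_true @ np.conj(x)   # desired output (noise-free for clarity)
    y = h @ x + g @ np.conj(x)             # widely-linear model output
    e = d - y                              # a priori error
    # Wirtinger-gradient SGD updates, normalised by the input power of each
    # branch -- a simple stand-in for a general step-size normalisation.
    p = np.vdot(x, x).real + eps
    h += mu * e * np.conj(x) / p
    g += mu * e * x / p

print(np.max(np.abs(h - h_true)))  # residual error of the linear part
```

With a proper (circular) complex input, the two branch inputs `x` and `conj(x)` are uncorrelated, so both parameter sets converge jointly; the normalisation by the instantaneous input power keeps the effective step size stable regardless of the input scale.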