State-of-the-art radio frequency transceivers for mobile communication devices suffer from transmitter-to-receiver (Tx-Rx) leakage in frequency division duplex operation. Although the duplexing distance between the Tx and Rx carriers prevents direct down-conversion of this leakage, transmit signal components may still fall into the Rx baseband due to further non-idealities in the analog front-end and thus degrade receiver performance. These so-called self-interference issues have been addressed in the literature, and several countermeasures have been proposed, among them digital and mixed-signal mitigation architectures. In this work, we analyze the problem from a system-level perspective and enable a fair comparison of the existing solutions by means of an extensive simulation framework that covers the factors most relevant to interference cancellation. Furthermore, we discuss the computational complexity of the estimation algorithms and other aspects relevant to real-time processing. This study reveals that choosing a suitable adaptive algorithm and optimizing it for the application is as crucial as the cancellation architecture itself.
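To illustrate the kind of adaptive cancellation the abstract refers to, the following is a minimal sketch of digital self-interference mitigation using a complex-valued LMS filter. All parameters (tap count, step size, the leakage channel `h`) are hypothetical and chosen for illustration only; the paper's actual architectures and estimation algorithms are not reproduced here. The known Tx baseband signal serves as the reference, an FIR filter adapts to replicate the leakage path, and the replica is subtracted from the received samples.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical setup: the Rx baseband observes a weak desired signal plus
# Tx leakage that has passed through an unknown short FIR channel.
n, taps = 20000, 4
tx = (rng.standard_normal(n) + 1j * rng.standard_normal(n)) / np.sqrt(2)
h = np.array([0.5, -0.2 + 0.1j, 0.05, 0.02j])   # unknown leakage channel (assumed)
leak = np.convolve(tx, h)[:n]                   # self-interference component
desired = 0.01 * (rng.standard_normal(n) + 1j * rng.standard_normal(n))
rx = desired + leak

# Complex LMS: adapt weights w so that w^H x replicates the leakage,
# then output the residual e = rx - replica.
w = np.zeros(taps, dtype=complex)
mu = 0.05                                       # step size (illustrative)
out = np.zeros(n, dtype=complex)
for k in range(taps, n):
    x = tx[k - taps + 1:k + 1][::-1]            # most recent sample first
    e = rx[k] - np.vdot(w, x)                   # vdot conjugates its first argument
    out[k] = e
    w += mu * x * np.conj(e)                    # standard complex LMS update
```

After convergence, the residual `out` is dominated by the desired signal, and the adapted weights approximate the conjugated leakage channel. Even this toy example shows why step size and filter length matter for the cancellation depth, echoing the abstract's point that algorithm tuning is as important as the architecture.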