Christine Bauer,
"Multi-Method Evaluation: Leveraging Multiple Methods to Answer What You Were Looking For"
: Proceedings of the 2020 Conference on Human Information Interaction and Retrieval (CHIIR '20), pp. 472-474, March 2020
Original Title:
Multi-Method Evaluation: Leveraging Multiple Methods to Answer What You Were Looking For
Title Language:
English
Original Book Title:
Proceedings of the 2020 Conference on Human Information Interaction and Retrieval (CHIIR '20)
Original Abstract:
Research in the field of information retrieval and recommendation
mostly focuses on a single evaluation method and a single
quality objective. On the one hand, many research endeavors focus
on system-centric evaluation from an algorithmic perspective
and consider the context of use only to a minor extent. On the
other hand, there are research endeavors focusing on user-centric
approaches to the design and evaluation of systems. However, algorithmic
quality and perceived quality of user experience do not
necessarily match. Thus, it is essential for system evaluation to
substantially integrate multiple evaluation methods that cover a variety
of relevant aspects and perspectives. Only such an integrated
combination of methods may lead to a deep understanding of users,
their behavior, and experience in their interaction with a system.
This half-day tutorial aims to raise awareness
in the CHIIR community of the significance of using multiple
methods in the evaluation of information retrieval and recommender
systems. The tutorial illustrates the "blind spots" that arise when
using single methods. It introduces the concept of "multi-method
evaluation" and discusses its benefits and challenges. While multi-method
evaluations may be designed very flexibly, the tutorial
presents broadly-defined basic options of how multiple methods
may be integrated in an evaluation design. In group work, participants
are encouraged to select and fine-tune a specific design that
best matches their research endeavor's purpose.