Exploring Explainability
A Definition, a Model, and a Knowledge Catalogue
- Written by
- Larissa Chazette, Wasja Brunotte, Timo Speith
- Abstract
The growing complexity of software systems and the influence of software-supported decisions in our society have created the need for software that is transparent, accountable, and trustworthy. Explainability has been identified as a means to achieve these qualities. It is recognized as an emerging non-functional requirement (NFR) that has a significant impact on system quality. However, to incorporate this NFR into systems, we need to understand what explainability means from a software engineering perspective and how it impacts other quality aspects in a system. This allows for an early analysis of the benefits and possible design issues that arise from interrelationships between different quality aspects. Nevertheless, explainability is currently under-researched in the domain of requirements engineering, and there is a lack of conceptual models and knowledge catalogues that support the requirements engineering process and system design. In this work, we bridge this gap by proposing a definition, a model, and a catalogue for explainability. They illustrate how explainability interacts with other quality aspects and how it may impact various quality dimensions of a system. To this end, we conducted an interdisciplinary Systematic Literature Review and validated our findings with experts in workshops.
- Organizational unit(s)
-
PhoenixD: Simulation, Fabrication, and Application of Optical Systems
Software Engineering Group
- External organization(s)
-
Universität des Saarlandes
- Type
- Article in conference proceedings
- Pages
- 197-208
- Number of pages
- 12
- Publication date
- 2021
- Publication status
- Published
- Peer-reviewed
- Yes
- ASJC Scopus subject areas
- General Computer Science, General Mechanical Engineering, Strategy and Management
- Electronic version(s)
-
https://doi.org/10.1109/RE51729.2021.00025 (Access: Closed)