Sufficient vs. Necessary: Building Trust in AI through Transparency

Czernietzki, Charlotte; Westmattelmann, Daniel; Schewe, Gerhard


Abstract

Organizations increasingly use AI-based systems to enhance decision-making quality and efficiency. For these systems to be accepted, they must be trusted, which is challenging given their black-box nature. This study addresses this challenge by investigating transparency’s role in building trust in AI-based systems within organizational settings. To ensure generalizability, we collected quantitative data (N = 978) across two scenarios differing in their degree of process automation (automated vs. augmented). Using structural equation modeling and necessary condition analysis, we analyzed the effect of a multidimensional conceptualization of transparency on trust. Our results demonstrate that the individual transparency dimensions not only positively affect trust but are also indispensable for its formation: without specific minimum levels of these dimensions, trust in AI-based systems cannot be established. This study advances the AI adoption literature by examining the transparency-trust relationship from both sufficiency and necessity perspectives, thereby guiding strategic AI implementation in organizations.

Keywords
AI; automation; augmentation; decision-making; transparency; trust; perceptions



Publication type
Research article in online collection (conference)

Peer reviewed
Yes

Publication status
Published

Year
2024

Conference
International Conference on Information Systems

Venue
Bangkok

Language
English
