Trends and Trajectories for Explainable, Accountable and Intelligible Systems: An HCI Research Agenda

Authors
Ashraf Abdul, Jo Vermeulen, Danding Wang, Brian Y. Lim, Mohan Kankanhalli.
Presented by
Ashraf Abdul

In Proceedings of CHI 2018: The ACM SIGCHI Conference on Human Factors in Computing Systems, Montréal, Canada, April 21–26, 2018, 18 pages.
[25.8% acceptance; 2590 submissions]

Abstract

Advances in artificial intelligence, sensors and big data management have far-reaching societal impacts. As these systems augment our everyday lives, it becomes increasingly important for people to understand them and remain in control. We investigate how HCI researchers can help to develop accountable systems by performing a literature analysis of 289 core papers on explanations and explainable systems, as well as 12,412 citing papers. Using topic modeling, co-occurrence and network analysis, we mapped the research space from diverse domains, such as algorithmic accountability, interpretable machine learning, context-awareness, cognitive psychology, and software learnability. We reveal fading and burgeoning trends in explainable systems, and identify domains that are closely connected or mostly isolated. The time is ripe for the HCI community to ensure that the powerful new autonomous systems have intelligible interfaces built-in. From our results, we propose several implications and directions for future research towards this goal.
