Oregon State University



Event Details

Colloquium: What should be in an XAI explanation? What IFT reveals by studying RTS

Monday, February 19, 2018 4:00 PM - 4:50 PM

Jon Dodge, Ph.D. Student
School of EECS, Oregon State University

"What should be in an explanation, and what should it look like?" This is a fundamental question to answer if Explainable Artificial Intelligence (XAI) is to gain the trust of human assessors. To this end, we conducted a pair of studies investigating the generation, content, and form of explanations in the Real-Time Strategy (RTS) domain, specifically StarCraft II. First, we observed expert explainers' (shoutcasters') foraging patterns and speech as they provided explanations in real time. Second, we used a lab study to examine how participants investigated agent behavior in the same domain, but without the real-time constraint. This pair of studies lets us examine both (1) explanations supplied by experts and (2) explanations demanded by assessors. Throughout, we adopted an Information Foraging Theory (IFT) perspective, which allows us to generalize our results. In this talk, we present what these results tell us about how to explain AI systems.

Linus Pauling Science Center
1 541 737 3617
School of Electrical Engineering and Computer Science