🎉 CVPRW2024 - Interactive visual explanations

Our work "Allowing humans to interactively guide machines where to look does not always improve human-AI team's classification accuracy" was accepted at the XAI4CV workshop at CVPR 2024 and will be part of the CVPR 2024 proceedings.

This research continues the line of work that began with our visual correspondence paper presented at NeurIPS 2022. Spearheaded by Mohammad, who made significant strides in building an interactive demo in which humans can guide image classifiers where to look, the project set out to explore a new dimension of human-AI collaboration: interactivity.

Despite our high hopes, our extensive testing revealed that this interactive approach did not enhance decision-making accuracy as anticipated. While initially disappointing, these results are, in our opinion, incredibly valuable. They represent one of the first attempts to integrate interactivity into XAI specifically for image classification. Although we did not achieve the intended improvement in performance, the insights gained are meaningful for the field.

We decided it was important to share our findings with the community, laying the groundwork for future research. By publishing our results, we aim to encourage further exploration and development in interactive XAI, confident that it holds the key to improving trust and performance in decision-making tasks.

We can’t wait to demonstrate our interactive interface at CVPR 2024 and discuss our insights with fellow researchers and enthusiasts. Join us to see how we are pushing the boundaries of what’s possible with human-guided AI in visual tasks. See you at CVPR 2024 in Seattle, WA!
