Debiasing Concept Bottleneck Models with Instrumental Variables
Concept-based explanation is a popular model interpretability approach because it expresses the reasons for a model's predictions in terms of concepts that are meaningful to domain experts. In this work, we study the problem of concepts being correlated with confounding information in the features. We propose a new causal prior graph for modeling the impact of unobserved variables, and a method that removes the effect of confounding information using instrumental variable techniques. We also model the completeness of the concept set. Our synthetic and real-world experiments demonstrate the success of our method in removing biases due to confounding and noise in the concepts.
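As a rough illustration of the instrumental-variable idea the abstract refers to (not the paper's actual estimator), the sketch below shows how a simple two-stage least-squares style IV estimate recovers the causal effect of a confounded "concept" variable on an outcome, where ordinary regression is biased. All variable names, coefficients, and the data-generating process are hypothetical, chosen only to demonstrate the technique.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 50_000

# Hypothetical data-generating process:
# u is an unobserved confounder, z is an instrument (affects the
# concept c but is independent of u), and y is the outcome.
u = rng.normal(size=n)
z = rng.normal(size=n)
c = 0.8 * z + u + rng.normal(size=n)          # concept, confounded by u
y = 2.0 * c + 3.0 * u + rng.normal(size=n)    # true causal effect of c is 2.0

# Naive least-squares slope of y on c: biased because u drives both c and y.
beta_ols = (c @ y) / (c @ c)

# IV estimate with a single instrument: cov(z, y) / cov(z, c).
# The instrument isolates the variation in c that is unrelated to u.
beta_iv = (z @ y) / (z @ c)

print(f"OLS estimate: {beta_ols:.2f}")  # noticeably above 2.0
print(f"IV estimate:  {beta_iv:.2f}")   # close to the true effect 2.0
```

Here the naive slope absorbs the confounder's contribution, while the IV estimate uses only the instrument-driven variation in the concept, recovering the true coefficient.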