Robust Feature-Level Adversaries are Interpretability Tools

Bibliographic Details
Title: Robust Feature-Level Adversaries are Interpretability Tools
Authors: Casper, Stephen; Nadeau, Max; Hadfield-Menell, Dylan; Kreiman, Gabriel
Publication Year: 2021
Collection: Computer Science
Subject Terms: Computer Science - Machine Learning, Computer Science - Artificial Intelligence, Computer Science - Computer Vision and Pattern Recognition
More Details: The literature on adversarial attacks in computer vision typically focuses on pixel-level perturbations, which tend to be very difficult to interpret. Recent work that manipulates the latent representations of image generators to create "feature-level" adversarial perturbations gives us an opportunity to explore perceptible, interpretable adversarial attacks. We make three contributions. First, we observe that feature-level attacks provide useful classes of inputs for studying representations in models. Second, we show that these adversaries are uniquely versatile and highly robust. We demonstrate that they can be used to produce targeted, universal, disguised, physically-realizable, and black-box attacks at the ImageNet scale. Third, we show how these adversarial images can be used as a practical interpretability tool for identifying bugs in networks. We use these adversaries to make predictions about spurious associations between features and classes, which we then test by designing "copy/paste" attacks in which one natural image is pasted into another to cause a targeted misclassification. Our results suggest that feature-level attacks are a promising approach for rigorous interpretability research. They support the design of tools to better understand what a model has learned and diagnose brittle feature associations. Code is available at https://github.com/thestephencasper/feature_level_adv
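
For orientation, a minimal self-contained sketch of the two techniques named in the abstract, written in PyTorch. The generator and classifier below are toy stand-ins rather than the paper's models (which operate at ImageNet scale), and every name and hyperparameter here (latent_dim, target_class, the 0.1 regularization weight, copy_paste_attack) is an illustrative assumption, not drawn from the paper's codebase; the authors' actual implementation is at the GitHub link above.

import torch
import torch.nn as nn
import torch.nn.functional as F

latent_dim, num_classes, target_class = 128, 1000, 309  # illustrative values

# Stand-in modules so the sketch runs end to end; the paper instead uses a
# real image generator and an ImageNet classifier.
generator = nn.Sequential(
    nn.Linear(latent_dim, 3 * 32 * 32),
    nn.Tanh(),
    nn.Unflatten(1, (3, 32, 32)),
)
classifier = nn.Sequential(nn.Flatten(), nn.Linear(3 * 32 * 32, num_classes))

# Feature-level attack: optimize the generator's latent code, not raw pixels,
# so that the generated image is classified as the target class.
z = torch.randn(1, latent_dim, requires_grad=True)
z_init = z.detach().clone()
opt = torch.optim.Adam([z], lr=0.01)

for step in range(200):
    opt.zero_grad()
    logits = classifier(generator(z))
    loss = F.cross_entropy(logits, torch.tensor([target_class])) \
        + 0.1 * (z - z_init).pow(2).sum()  # keep the latent near its start
    loss.backward()
    opt.step()

print("predicted class:", classifier(generator(z)).argmax(dim=1).item())

# "Copy/paste" attack: paste one natural image into another and test whether
# a predicted spurious feature-class association flips the label.
def copy_paste_attack(base_image, patch, row, col):
    attacked = base_image.clone()          # NCHW tensors assumed
    _, _, ph, pw = patch.shape
    attacked[:, :, row:row + ph, col:col + pw] = patch
    return attacked

The regularization toward the initial latent is one simple way to keep the attacked image perceptually close to a natural starting point; the paper's actual objective and generator manipulations differ in detail.
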
Comment: NeurIPS 2022, code available at https://github.com/thestephencasper/feature_level_adv
Document Type: Working Paper
Access URL: http://arxiv.org/abs/2110.03605
Accession Number: edsarx.2110.03605
Database: arXiv