
Switch from register_full_backward_hooks to tensor hooks

vivekmig opened this issue 3 years ago · 0 comments

This switches usage of full backward hooks to instead apply forward hooks, which then add tensor backward hooks, as suggested in #914. We initially did not choose this approach since it has limitations for modules with multiple tensors as inputs / outputs (a hook must be registered on each tensor independently), but all current use-cases within Captum only require a single tensor input / output.
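As a rough sketch of the pattern (not Captum's actual implementation), a forward hook on the module registers a backward hook directly on the output tensor, instead of using `register_full_backward_hook` on the module itself. The gradient-scaling body of `tensor_grad_hook` below is a placeholder standing in for whatever gradient manipulation an attribution method needs:

```python
import torch
import torch.nn as nn

def forward_hook(module, inputs, output):
    # Assumes a single tensor output, matching Captum's current use-cases.
    def tensor_grad_hook(grad):
        # Placeholder for the logic that previously lived in the
        # module's full backward hook.
        return grad * 1.0

    output.register_hook(tensor_grad_hook)

# Works even with an in-place module, which full backward hooks disallow.
model = nn.Sequential(nn.Linear(4, 4), nn.ReLU(inplace=True))
handle = model[1].register_forward_hook(forward_hook)

x = torch.randn(2, 4, requires_grad=True)
model(x).sum().backward()
handle.remove()
```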

This change allows us to support in-place modules as well as remove the limitation on neuron input attribution. DeepLift also no longer needs valid module checks, as these are no longer applicable when using tensor hooks.

vivekmig · Jun 21 '22 15:06