
AssertionError: Elements to be reduced can only be either Tensors or tuples containing Tensors.

Open soloccc opened this issue 4 years ago • 3 comments

When I follow the quick start, using PyTorch with a two-layer model (linear-relu-linear), this line:

cond_vals = cond.attribute(test_input_tensor, target=1)

raises the error above. I have checked that test_input_tensor is a tensor, so I don't know how to solve it.

soloccc avatar Aug 31 '21 05:08 soloccc

Hi @soloccc,

In order to better help you, we need to look into your source code. Could you share snippets showing how you define your model (including the forward function), and how you are using Captum?

bilalsal avatar Aug 31 '21 18:08 bilalsal

Thanks :) the code is as follows:

class twolayer(nn.Module):
    def __init__(self, input_data, hidden_layer, output_data):
        super(twolayer, self).__init__()
        self.linear1 = nn.Linear(input_data, hidden_layer)
        self.ReLU = nn.ReLU()
        self.sigmoid = nn.Sigmoid()
        self.linear2 = nn.Linear(hidden_layer, output_data)

        self.bn_in = nn.BatchNorm1d(input_data)
        self.bn1 = nn.BatchNorm1d(hidden_layer)
        self.bn2 = nn.BatchNorm1d(output_data)

    def forward(self, x):
        x = self.bn_in(x)
        a1 = self.linear1(x)

        self.ReLU(a1)
        a1 = self.bn1(a1)

        y_pred = self.linear2(a1)
        y_pred = self.bn2(y_pred)
        return y_pred
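[Editor's note: a quick way to sanity-check this model's forward pass independently of Captum is the minimal sketch below. The layer sizes are hypothetical (41 features as in KDD Cup 99, 2 output classes). Note that in the original code `self.ReLU(a1)` on a line by itself discards its result, so the activation is never actually applied; the sketch keeps the result.]

```python
import torch
import torch.nn as nn

class TwoLayer(nn.Module):
    def __init__(self, input_data, hidden_layer, output_data):
        super().__init__()
        self.linear1 = nn.Linear(input_data, hidden_layer)
        self.relu = nn.ReLU()
        self.linear2 = nn.Linear(hidden_layer, output_data)
        self.bn_in = nn.BatchNorm1d(input_data)
        self.bn1 = nn.BatchNorm1d(hidden_layer)
        self.bn2 = nn.BatchNorm1d(output_data)

    def forward(self, x):
        x = self.bn_in(x)
        a1 = self.relu(self.linear1(x))  # keep the activation's result
        a1 = self.bn1(a1)
        return self.bn2(self.linear2(a1))

model = TwoLayer(41, 16, 2)  # hypothetical sizes: 41 KDD Cup 99 features, 2 classes
model.eval()  # eval mode so BatchNorm uses running stats for a quick check
out = model(torch.randn(4, 41))
print(out.shape)  # torch.Size([4, 2])
```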

I am using it with PyTorch. I am trying it on KDD Cup 99 (an intrusion-detection dataset) to see whether it yields any new ideas for security applications in cybersecurity.

soloccc avatar Sep 01 '21 15:09 soloccc

Hi @soloccc , apologies for missing your question earlier!

You might be facing the same issue as in #445. Can you make sure both the forward function and autograd work independently of Captum? To test autograd, you can execute the following:

with torch.autograd.set_grad_enabled(True):
    model_out = model_forward(test_input_tensor)
    # selected_out: the outputs to differentiate, e.g. model_out[:, target];
    # test_input_tensor must be created with requires_grad=True
    grads = torch.autograd.grad(torch.unbind(selected_out), test_input_tensor)
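[Editor's note: a self-contained sketch of that autograd check, with a hypothetical toy model standing in for `model_forward` and an example output selection for `selected_out` (both are placeholders in the snippet above):]

```python
import torch
import torch.nn as nn

# Hypothetical stand-ins for the model and input in the snippet above.
model = nn.Sequential(nn.Linear(4, 8), nn.ReLU(), nn.Linear(8, 2))
test_input_tensor = torch.randn(3, 4, requires_grad=True)  # grads require requires_grad=True

with torch.autograd.set_grad_enabled(True):
    model_out = model(test_input_tensor)
    selected_out = model_out[:, 1]  # e.g. the target=1 column
    # unbind yields one scalar per sample; grad sums their gradients w.r.t. the input
    grads = torch.autograd.grad(torch.unbind(selected_out), test_input_tensor)

print(grads[0].shape)  # torch.Size([3, 4])
```

If this raises, the problem is in the model or the input rather than in Captum.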

Hope this helps

bilalsal avatar Dec 15 '21 22:12 bilalsal