
[Warning] Unknown operation None encountered, which will be handled as an element-wise op

Open hoangdung3498 opened this issue 2 years ago • 2 comments

When I prune, I get this warning:

    site-packages/torch_pruning/dependency.py:738: UserWarning: [Warning] Unknown operation None encountered, which will be handled as an element-wise op
      str(grad_fn))

As a result, nothing actually gets pruned (Before Pruning: MACs=313283857959, #Params=55272487; After Pruning: MACs=313283857959, #Params=55272487).

I tried printing some values in the `_trace_computational_graph` function of the `DependencyGraph` class, like this:

    def _trace_computational_graph(self, module2node, grad_fn_root, gradfn2module, reused):

        def create_node_if_not_exists(grad_fn):
            print("grad_fn in dependency :",grad_fn)
            module = gradfn2module.get(grad_fn, None)
            if module is not None \
                and module in module2node \
                    and module not in reused:
                return module2node[module]
            
            print("module in dependency:",module)
            # 1. link grad_fns and modules
            if module is None:  # a new module
                if not hasattr(grad_fn, "name"):
                    # we treat all unknown modules as element-wise operations by default,
                    # which do not modify the #dimensions/#channels of features.
                    # If you have some customized layers, please register it with DependencyGraph.register_customized_layer
                    module = ops._ElementWiseOp(self._op_id, "Unknown")
                    self._op_id+=1
                    if self.verbose:
                        warnings.warn(
                            "[Warning] Unknown operation {} encountered, which will be handled as an element-wise op".format(
                                str(grad_fn))
                        )
                elif "catbackward" in grad_fn.name().lower():
                    module = ops._ConcatOp(self._op_id)
                    self._op_id+=1
                elif "split" in grad_fn.name().lower():
                    module = ops._SplitOp(self._op_id)
                    self._op_id+=1
                elif "view" in grad_fn.name().lower() or 'reshape' in grad_fn.name().lower():
                    module = ops._ReshapeOp(self._op_id)
                    self._op_id+=1
                else:
                    # treat other ops as element-wise ones, like Add, Sub, Div, Mul.
                    module = ops._ElementWiseOp(self._op_id, grad_fn.name())
                    self._op_id+=1
                gradfn2module[grad_fn] = module
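(For context: the dispatch above keys off `grad_fn.name()`, the string name of the autograd node that PyTorch records for each traced op. A minimal stand-alone sketch of what those names look like, independent of Torch-Pruning:)

```python
import torch

x = torch.randn(2, 3, requires_grad=True)

# cat produces a node whose name contains "catbackward"
y = torch.cat([x, x], dim=0)
print(y.grad_fn.name())  # e.g. 'CatBackward0'

# view/reshape produce a node whose name contains "view" or "reshape"
z = x.view(3, 2)
print(z.grad_fn.name())  # e.g. 'ViewBackward0'
```

So when `grad_fn` itself is `None`, there is no node name to match at all, and the tracer cannot classify anything.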

And it shows:

    grad_fn in dependency : None
    module in dependency: None
    grand_fn in Node: None
    grad_fn in dependency : None
    grad_fn in dependency : None

This is different from other models I have pruned before, where these values were not None.

I had this in my code:

    for p in net.parameters():
        p.requires_grad_(True)

So I don't know what is wrong. Can anyone help me? Thanks for any help.
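(One common cause of `grad_fn` being `None`, even with `requires_grad_(True)` set on all parameters: the forward pass that Torch-Pruning traces was run under `torch.no_grad()` / `torch.inference_mode()`, or the output was detached, so autograd recorded no graph. A minimal PyTorch sketch of the difference:)

```python
import torch

lin = torch.nn.Linear(4, 4)
x = torch.randn(1, 4)

# Normal forward: the output carries a grad_fn the tracer can walk.
y = lin(x)
print(y.grad_fn)  # an autograd node, not None

# Under no_grad, grad_fn is None and there is no graph to trace.
with torch.no_grad():
    y2 = lin(x)
print(y2.grad_fn)  # None
```

It may be worth checking whether the model's forward (or a wrapper around it) disables gradient recording internally.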

hoangdung3498 avatar Jul 02 '23 09:07 hoangdung3498

@VainF Can you help me?

hoangdung3498 avatar Jul 03 '23 07:07 hoangdung3498