Add norm param to ResNet
Fixes #7294.
Description
Adds a norm param to ResNet
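As a rough illustration of the idea, here is a minimal sketch of how such a `norm` argument can be threaded through a residual block. The `get_norm_layer` helper and `BasicBlock` below are hypothetical stand-ins, not MONAI's actual implementation (MONAI has its own layer factories).

```python
import torch
import torch.nn as nn

# Hypothetical factory: maps a norm name to a constructed layer.
# Names and signature are illustrative, not MONAI's API.
def get_norm_layer(name: str, channels: int) -> nn.Module:
    factories = {
        "batch": lambda c: nn.BatchNorm3d(c),
        "instance": lambda c: nn.InstanceNorm3d(c),
        "group": lambda c: nn.GroupNorm(num_groups=8, num_channels=c),
    }
    return factories[name](channels)

class BasicBlock(nn.Module):
    """Toy residual block that takes the norm type as a parameter."""

    def __init__(self, channels: int, norm: str = "batch"):
        super().__init__()
        self.conv1 = nn.Conv3d(channels, channels, 3, padding=1)
        self.norm1 = get_norm_layer(norm, channels)
        self.act = nn.ReLU(inplace=True)
        self.conv2 = nn.Conv3d(channels, channels, 3, padding=1)
        self.norm2 = get_norm_layer(norm, channels)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        out = self.act(self.norm1(self.conv1(x)))
        out = self.norm2(self.conv2(out))
        return self.act(out + x)

block = BasicBlock(8, norm="instance")
y = block(torch.randn(1, 8, 4, 4, 4))
print(y.shape)  # torch.Size([1, 8, 4, 4, 4])
```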
Types of changes
- [x] Non-breaking change (fix or new feature that would not break existing functionality).
- [ ] Breaking change (fix or new feature that would cause existing functionality to change).
- [x] New tests added to cover the changes.
- [x] Integration tests passed locally by running `./runtests.sh -f -u --net --coverage`.
- [x] Quick tests passed locally by running `./runtests.sh --quick --unittests --disttests`.
- [x] In-line docstrings updated.
- [x] Documentation updated, tested `make html` command in the `docs/` folder.
Hi @Pkaps25 thanks for this, is it ready to review? We should add at least one test for the new argument in Daf3D however.
Daf3D doesn't currently have a norm parameter and it looks like adding that in might be tricky since certain parts of the network expect group norm and others use ResNet's batch norm. I'd like to avoid adding that param in this PR, so short of that, would the existing tests cover my change?
Hi @Pkaps25, following our discussion at the dev meeting: one thing I overlooked in this PR is whether the renaming of `act` will affect weight loading for the original model. Can you help check that? If so, we may need to revert the variable name change and add an alias.
Thanks!
Ah, nice catch. I will take a look.
Hi @KumoLiu, I trained ResNet with the latest MONAI and loaded the model state using my branch. The model loaded successfully, had the same layers, and a forward pass with a random input produced the same result on both branches. Let me know if that's a sufficient check and whether we should add tests for this compatibility in the future.
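The compatibility check described above can be sketched roughly as follows. The `build` helper is a hypothetical stand-in for constructing the same architecture on both branches; the point is that `load_state_dict(strict=True)` raises if any parameter key was renamed, and an identical forward pass confirms the loaded weights behave the same.

```python
import torch
import torch.nn as nn

# Hypothetical stand-in for building the same model on both branches;
# a real check would construct MONAI's ResNet from each branch instead.
def build() -> nn.Sequential:
    return nn.Sequential(
        nn.Conv3d(1, 4, kernel_size=3, padding=1),
        nn.BatchNorm3d(4),
        nn.ReLU(),
    )

torch.manual_seed(0)
old = build()            # stands in for the model trained on main
state = old.state_dict()

new = build()            # stands in for the model built from this branch
new.load_state_dict(state, strict=True)  # strict=True raises on renamed/missing keys

x = torch.randn(1, 1, 4, 4, 4)
old.eval()
new.eval()
# Same weights, same input: outputs should be bit-identical.
assert torch.equal(old(x), new(x))
print("state dict keys and forward outputs match")
```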
Thanks for the quick verification. It makes sense that it works, since the activation function usually has no parameters to learn in the neural network. I don't think we need to add a test for this, but if the norm layer name changed, it would be worth adding one. What do you think? cc @ericspod
Agreed that if the name changes we should add that test, but I don't change the name in this PR.
This looks good then. I thought a layer with no parameters wouldn't show up in the state dict, but it's best to double check.
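The point above can be demonstrated in a few lines. This is a hypothetical minimal example (module and attribute names are illustrative): `ReLU` contributes no entries to the state dict, so renaming the attribute leaves the keys identical and loading still succeeds under `strict=True`.

```python
import torch.nn as nn

# Illustrative modules: same parameters, only the activation attribute renamed.
class OldNet(nn.Module):
    def __init__(self):
        super().__init__()
        self.bn = nn.BatchNorm1d(4)
        self.relu = nn.ReLU()   # original attribute name

class NewNet(nn.Module):
    def __init__(self):
        super().__init__()
        self.bn = nn.BatchNorm1d(4)
        self.act = nn.ReLU()    # renamed attribute

old_keys = set(OldNet().state_dict().keys())
new_keys = set(NewNet().state_dict().keys())
assert old_keys == new_keys     # ReLU has no parameters, so no key changes
NewNet().load_state_dict(OldNet().state_dict(), strict=True)  # succeeds
print(sorted(old_keys))         # only "bn.*" entries appear
```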
/build