Implement TorchIO transforms wrapper analogous to TorchVision transforms wrapper and test case
Fixes #7499 .
Description
As discussed in the issue, this PR implements a wrapper class for TorchIO transforms, analogous to the TorchVision transforms wrapper.
The test cases simply check that the transforms are callable and that applying a transform produces an output different from the input.
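The wrapper pattern described above can be sketched as follows. This is a hypothetical illustration, not MONAI's actual API: the class name `TorchIOWrapper`, the `transforms_module` parameter, and the stand-in `AddOffset` transform are all invented here so the sketch runs without torchio installed. The core idea is simply to look up a transform class by name and delegate `__call__` to it.

```python
from types import SimpleNamespace

class TorchIOWrapper:
    """Hypothetical sketch: resolve a transform by name and delegate to it."""

    def __init__(self, name, transforms_module, **kwargs):
        # Look up the transform class by name, e.g. "RandomNoise" in torchio.transforms.
        transform_cls = getattr(transforms_module, name)
        self.trans = transform_cls(**kwargs)

    def __call__(self, img):
        # Delegate the call to the wrapped transform instance.
        return self.trans(img)

# Stand-in for the torchio.transforms module, so the sketch is self-contained.
class AddOffset:
    def __init__(self, offset=1):
        self.offset = offset

    def __call__(self, img):
        return [v + self.offset for v in img]

fake_tio = SimpleNamespace(AddOffset=AddOffset)
t = TorchIOWrapper("AddOffset", fake_tio, offset=2)
print(t([1, 2]))  # [3, 4]
```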
Types of changes
- [x] Non-breaking change (fix or new feature that would not break existing functionality).
- [ ] Breaking change (fix or new feature that would cause existing functionality to change).
- [x] New tests added to cover the changes.
- [ ] Integration tests passed locally by running `./runtests.sh -f -u --net --coverage`.
- [ ] Quick tests passed locally by running `./runtests.sh --quick --unittests --disttests`.
- [x] In-line docstrings updated.
- [x] Documentation updated, tested `make html` command in the `docs/` folder.
I'm not sure where to add the torchio dependency to make CI pass. I expect all min-dep checks to fail, as I put the dependency in requirements-dev.txt, environment-dev.yml, and setup.cfg. The premerge-quick checks should pass.
Hi @SomeUserName1, you can add a skip wrapper like this: https://github.com/Project-MONAI/MONAI/blob/c86e790d56f3a5ad07db6e642c6a65e7779b7b21/tests/test_patch_wsi_dataset.py#L184
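The linked skip pattern can be mimicked in a self-contained way with the standard library alone (MONAI has its own helpers for this; `module_available` and the test class below are illustrative stand-ins). The test class is skipped entirely when the optional dependency cannot be imported, so min-dep CI runs stay green.

```python
import importlib.util
import unittest

def module_available(name):
    # True if the optional dependency can be imported in this environment.
    return importlib.util.find_spec(name) is not None

# "json" stands in for an optional dependency like "torchio"; it is stdlib,
# so this sketch always runs the test rather than skipping it.
@unittest.skipUnless(module_available("json"), "optional dependency not installed")
class TestWrappedTransform(unittest.TestCase):
    def test_callable(self):
        self.assertTrue(callable(len))

suite = unittest.TestLoader().loadTestsFromTestCase(TestWrappedTransform)
result = unittest.TextTestRunner(verbosity=0).run(suite)
```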
Thanks.
One test failed; I have no idea why, as neither the test nor the corresponding code is anything I've touched:
```
Traceback (most recent call last):
  File "/home/runner/work/MONAI/MONAI/tests/test_regularization.py", line 73, in test_cutmixd
    self.assertTrue(not torch.allclose(output["a"], output["b"]))
AssertionError: False is not true
```
Looks like a flaky random failure; fixed in this commit: https://github.com/Project-MONAI/MONAI/pull/7542/commits/40bb1300e8baccecbe05a61726ba508d27753eb2
@KumoLiu only the blossom-ci check hasn't passed. Can we merge?
Hi @SomeUserName1, thanks for the PR! We might consider incorporating this new feature after the 1.3.1 release due to the new requirements included in this PR.
1. It would be nice to have this available for the dictionary versions as well (like [torchvisiond](https://docs.monai.io/en/stable/transforms.html#torchvisiond))
Implemented.
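The dictionary version follows the usual MONAI pattern of applying a wrapped transform to selected keys of a data dictionary. This sketch is illustrative only (the class name `TorchIOd` and its constructor signature are assumptions, not the merged API):

```python
class TorchIOd:
    """Hypothetical sketch of a dictionary wrapper, analogous to torchvisiond."""

    def __init__(self, keys, transform):
        # Accept a single key or a list of keys, as MONAI dict transforms do.
        self.keys = keys if isinstance(keys, (list, tuple)) else [keys]
        self.transform = transform

    def __call__(self, data):
        d = dict(data)  # shallow copy; entries outside `keys` are left untouched
        for k in self.keys:
            d[k] = self.transform(d[k])
        return d

double = lambda xs: [2 * v for v in xs]
td = TorchIOd(keys=["image"], transform=double)
sample = {"image": [1, 2], "label": [0, 1]}
out = td(sample)
print(out)  # {'image': [2, 4], 'label': [0, 1]}
```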
2. The transform currently does not implement the RandomizableTrait, so I'm not sure how it would interact with CacheDataset etc. I think it might help to have a version with the RandomizableTrait for all random transforms (these seem easy to find, since their names start with "Random") and one without for the rest. I'm not sure how best to design this, but one could use the `__new__()` method to select the correct version based on the name. Alternatively, an error/warning could be raised if the wrong version is used.
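The `__new__()` dispatch idea could be sketched like this. All names here are stand-ins (a real implementation would use MONAI's actual `RandomizableTrait` from `monai.transforms.traits`); the point is only to show name-based selection of the randomizable variant:

```python
class RandomizableTrait:
    """Stand-in for MONAI's RandomizableTrait marker interface."""

class TorchIOWrapper:
    def __new__(cls, name, **kwargs):
        # Dispatch: transforms whose name starts with "Random" get the
        # randomizable variant, as suggested in the review comment.
        if cls is TorchIOWrapper and name.startswith("Random"):
            return super().__new__(RandomTorchIOWrapper)
        return super().__new__(cls)

    def __init__(self, name, **kwargs):
        self.name = name
        self.kwargs = kwargs

class RandomTorchIOWrapper(TorchIOWrapper, RandomizableTrait):
    """Variant carrying the marker trait, so CacheDataset-style logic can detect it."""

deterministic = TorchIOWrapper("Crop")
randomized = TorchIOWrapper("RandomNoise")
print(type(randomized).__name__)  # RandomTorchIOWrapper
```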
TorchIO also has a common RandomTransform base class, which differs from MONAI's RandomTransform.
All tio transforms are probabilistic and support the p= parameter, which specifies the probability that the transform is applied. This is analogous to RandomizableTransform in MONAI.
TorchIO's RandomTransform additionally indicates that there is randomness in the applied transform itself (like RandomNoise), on top of its probabilistic application.
Thus, I add the RandomizableTrait as an ABC to the TorchIO wrapper for all transforms.
I'm not sure whether I should also add Randomizable or RandomTransform, as this functionality is already implemented in torchio. Adding these two would make the capabilities of these transforms clearer, but it would require manipulating TorchIO objects/class instances in MONAI code, introducing complexity with little practical value (probabilistic application can already be done via the tio parameters; is there a benefit in adding the same functionality via MONAI, given the different underlying concepts?).
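The p= semantics discussed above (probabilistic application, as distinct from randomness inside the transform itself) can be sketched as follows. This is a conceptual illustration with invented names, not torchio's or MONAI's implementation:

```python
import random

class ApplyWithProbability:
    """Illustrative sketch of tio-style p= semantics: the wrapped transform
    may itself be deterministic; only whether it runs at all is random."""

    def __init__(self, transform, p=1.0, seed=None):
        self.transform = transform
        self.p = p
        self._rng = random.Random(seed)

    def __call__(self, x):
        # random() is in [0, 1), so p=1.0 always applies and p=0.0 never does.
        if self._rng.random() < self.p:
            return self.transform(x)
        return x

negate = lambda xs: [-v for v in xs]
always = ApplyWithProbability(negate, p=1.0)
never = ApplyWithProbability(negate, p=0.0)
print(always([1, 2]), never([1, 2]))  # [-1, -2] [1, 2]
```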