MLRadfys

Results 15 comments of MLRadfys

Hi Dominik, thanks! It does not affect the functionality though. I just stumbled upon it while debugging my code :-) Cheers, Michael

Hi, I think the main purpose is to show the relationship between the patch size and the resampled volume. As Dominik wrote in the example: "A good patch shape median image ratio...
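To make that ratio concrete, here is a rough sketch with made-up image shapes (not MIScnn code, just an illustration of comparing a chosen patch shape against the dataset's median image shape):

```python
import numpy as np

# Made-up image shapes after resampling; in practice these would be read
# from the dataset itself (e.g. the NIfTI headers).
image_shapes = np.array([
    (320, 320, 112),
    (288, 288, 96),
    (336, 336, 128),
])
patch_shape = np.array((160, 160, 80))   # hypothetical patch shape

median_shape = np.median(image_shapes, axis=0)
ratio = patch_shape / median_shape

print("median image shape:", median_shape)
print("patch shape / median image shape:", np.round(ratio, 2))
```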

Hi again, here you go:

```python
from miscnn.data_loading.interfaces.nifti_io import NIFTI_interface
from miscnn.data_loading.data_io import Data_IO
from miscnn.processing.preprocessor import Preprocessor
from miscnn.neural_network.architecture.unet.standard import Architecture
from miscnn.neural_network.model import Neural_Network
import os
import tensorflow...
```

Thanks for this detailed explanation! Yes, I noticed that the cropping and augmentation can become a huge bottleneck when training on the fly. Maybe some multiprocessing could increase the performance...
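In case it helps, here is a minimal sketch of what I had in mind, assuming a hypothetical `preprocess_case` routine standing in for the actual cropping/augmentation (the volumes below are random dummies, not real data or MIScnn API calls):

```python
from multiprocessing import Pool

import numpy as np


def preprocess_case(volume):
    # Hypothetical per-sample preprocessing: crop a patch and apply a
    # simple flip augmentation. Replace with the project's real routines.
    patch = volume[:64, :64, :64]
    return np.flip(patch, axis=0)


if __name__ == "__main__":
    # Dummy volumes standing in for loaded NIfTI images.
    volumes = [np.random.rand(128, 128, 128) for _ in range(8)]
    # Spread the cropping/augmentation over several worker processes so the
    # GPU is not starved while the CPU prepares the next batch.
    with Pool(processes=4) as pool:
        patches = pool.map(preprocess_case, volumes)
    print(len(patches), "patches preprocessed")
```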

> This idea is fantastic. :)
> I had something like that already in mind when implementing the seed for temporary files in the batches directory. Therefore, it should be...

Hi and thanks for the quick reply! I tried to write some code on my own for a project, but it seems like I have an offset between my dose...

Hi and thanks for providing Grounding DINO as a pip package @giswqs! I compared the inference output of the original repo with the output of the package, and it...

Hi! Yes, if I remember correctly I reduced both the learning rate and the batch size to 64 :-) Cheers, M

Hi again! I tested running the code with the above command. Unfortunately I am still having the same issues. It seems like the loss and all other metrics are not changing...

Hi again! Interestingly, when I reduce the learning rate from 0.00025 to 0.00005 it works. Does this have something to do with the distributed training? Thanks again, kind regards, M
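For context, a hedged sketch of why distributed training could matter: with synchronous data parallelism the effective batch size is the per-replica batch size times the number of replicas, so a learning rate tuned on a single device is often too large. The replica count below is purely hypothetical and only illustrates the commonly used linear scaling rule:

```python
# Illustrative numbers only; the replica count is a hypothetical assumption.
base_lr = 0.00005        # learning rate that ended up working
num_replicas = 5         # hypothetical number of data-parallel replicas
per_replica_batch = 64   # batch size processed on each replica

effective_batch = per_replica_batch * num_replicas
# Linear scaling rule: scale the single-device learning rate by the number
# of replicas to keep the per-example update magnitude comparable.
scaled_lr = base_lr * num_replicas

print(f"effective batch size: {effective_batch}")
print(f"linearly scaled learning rate: {scaled_lr:.5f}")
```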