SEGAN

Shouldn't ref_batch be inside the loop?

Open keunwoochoi opened this issue 6 years ago

Currently, ref_batch is sampled only once, before the training loop starts (here, at line 32). Shouldn't it be randomly re-sampled for each batch? I'm only guessing from the implementation (I haven't read the virtual batch normalization paper or anything else), but the docstring below sounds like it should be, and I've sketched the change I have in mind after the snippet. What do you think?

    def reference_batch(self, batch_size):
        """
        Randomly selects a reference batch from dataset.
        Reference batch is used for calculating statistics for virtual batch normalization operation.

        Args:
            batch_size(int): batch size

        Returns:
            ref_batch: reference batch
        """
        ref_file_names = np.random.choice(self.file_names, batch_size)
        ref_batch = np.stack([np.load(f) for f in ref_file_names])

        ref_batch = emphasis(ref_batch, emph_coeff=0.95)
        return torch.from_numpy(ref_batch).type(torch.FloatTensor)
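
To make the suggestion concrete, here is a rough sketch of the training-loop change I mean (untested; `dataset`, `train_loader`, `batch_size`, and `num_epochs` are placeholders for the corresponding objects in the actual training script):

    for epoch in range(num_epochs):
        for noisy_batch, clean_batch in train_loader:
            # Re-sample a fresh reference batch for every training step,
            # instead of reusing the single batch drawn once before the loop.
            ref_batch = dataset.reference_batch(batch_size)

            # ... discriminator and generator updates that consume ref_batch ...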

keunwoochoi · Mar 09 '19