Update tutorials to use `self.log`
🐛 Bug
Some tutorials still return non-detached tensors from `training_step`. This pattern is deprecated in 1.6 and may cause memory leaks for anyone who follows it.
e.g.

```python
output = OrderedDict({"loss": g_loss, "progress_bar": tqdm_dict, "log": tqdm_dict})
```
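For reference, here's a minimal sketch of the non-deprecated pattern the tutorials could use instead: call `self.log` (with `prog_bar=True` standing in for the old `"progress_bar"` dict entry) and return only the loss. The module body is a hypothetical placeholder, not the actual GAN from the tutorial:

```python
import torch
import pytorch_lightning as pl


class LitGAN(pl.LightningModule):
    def __init__(self):
        super().__init__()
        self.layer = torch.nn.Linear(32, 1)  # stand-in for a real generator/discriminator

    def training_step(self, batch, batch_idx):
        g_loss = self.layer(batch).mean()  # placeholder for the actual GAN loss computation
        # Log through self.log instead of returning "log"/"progress_bar" dict keys;
        # prog_bar=True shows the metric in the progress bar like the old dict did.
        self.log("g_loss", g_loss, prog_bar=True)
        # Return just the loss tensor; Lightning uses it for the backward pass.
        return g_loss

    def configure_optimizers(self):
        return torch.optim.Adam(self.parameters(), lr=1e-3)
```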
The error pops up in the autogenerated docs here: https://pytorch-lightning.readthedocs.io/en/stable/notebooks/lightning_examples/basic-gan.html
I raised this in https://github.com/PyTorchLightning/lightning-bolts/issues/793 too.
I see the examples in https://github.com/PyTorchLightning/pytorch-lightning/tree/master/pl_examples already defer to lightning-bolts for more robust examples. Wherever the canonical source of docs/best practices ends up living, I think this specific error should be fixed (these examples are also discussed in https://github.com/PyTorchLightning/lightning-tutorials/issues/71).
Thanks, loving the library btw! :)
Hi @loodvn, are you interested in sending out a fix? :]
Unfortunately not, sorry - https://github.com/PyTorchLightning/lightning-bolts/issues/793#issuecomment-1014226594