Duplicate UserWarning Logs for `lightning-sdk` Version Check
🐛 Bug
While using either `optimize` or `StreamingDataset`, multiple copies of the same warning are logged. It should appear at most once.
To Reproduce
Use the `optimize` function when your machine's `lightning-sdk` is not the latest version.
Code sample
```python
# Ideally attach a minimal code sample to reproduce the described issue.
# Minimal means having the shortest code while still preserving the bug.
```
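A minimal reproduction sketch, assuming an outdated `lightning-sdk` is installed so the version check fires; the transform and inputs below are illustrative, as any `optimize` workload triggers it:

```python
from litdata import optimize

def to_sample(index):
    # Trivial transform; the workload itself is irrelevant to the bug.
    return {"value": index}

if __name__ == "__main__":
    optimize(
        fn=to_sample,
        inputs=list(range(100)),
        output_dir="optimized_dataset",
        chunk_bytes="64MB",
        num_workers=4,  # each worker process repeats the version warning
    )
```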
Expected behavior
The version-check warning should be emitted at most once.
Additional context
Environment details
- PyTorch Version (e.g., 1.0):
- OS (e.g., Linux):
- How you installed PyTorch (conda, pip, source):
- Build command you used (if compiling from source):
- Python version:
- CUDA/cuDNN version:
- GPU models and configuration:
- Any other relevant information:
It seems that `lightning_sdk` does not have a warning cache. I believe that adding something similar to the `WarningCache` from litdata to the Lightning SDK could help reduce these unnecessary version-related warnings.
For reference, here is the WarningCache implementation in litdata:
https://github.com/Lightning-AI/litdata/blob/4ff18da3bfa04555de7fa92b6287c368c8c27c31/src/litdata/helpers.py#L9-L10
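For context, such a cache is small; here is an illustrative sketch (not the exact litdata code) that deduplicates by message within a process:

```python
import warnings

class WarningCache(set):
    """Emit each distinct warning message at most once per process."""

    def warn(self, message: str, stacklevel: int = 2) -> None:
        if message not in self:
            self.add(message)
            warnings.warn(message, stacklevel=stacklevel)

# Module-level singleton shared by all callers in this process.
_warning_cache = WarningCache()
_warning_cache.warn("A newer version of lightning-sdk is available.")  # shown
_warning_cache.warn("A newer version of lightning-sdk is available.")  # suppressed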
Alternatively, it could be used directly from `lightning_utilities`.
https://github.com/Lightning-AI/utilities/blob/9e88903a7cecd7933da6ef659f1eee22ac652de1/src/lightning_utilities/core/rank_zero.py#L95-L113
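A usage sketch for the `lightning_utilities` variant; note that `rank_zero_only.rank` must be set before the cache warns (see the question below), and deriving it from the `RANK` env var set by torchrun is one common convention, not the library's prescribed method:

```python
import os
from lightning_utilities.core.rank_zero import WarningCache, rank_zero_only

# Must be set before any rank_zero_* helper is used; default to 0 for
# single-process runs.
rank_zero_only.rank = int(os.environ.get("RANK", 0))

warning_cache = WarningCache()
warning_cache.warn("A newer version of lightning-sdk is available.")  # rank 0, once
warning_cache.warn("A newer version of lightning-sdk is available.")  # suppressed
```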
Question:
By the way, it seems that `rank_zero_only.rank` needs to be set in this context. How is that typically configured?
cc: @Borda @tchaton
Yes, these warnings should not be duplicated, so let's update the SDK accordingly...
I think a warning cache might not work with multiple workers in the DataLoader. Instead, we should check for it in `StreamingDataLoader` (on the litdata side).
But I doubt this will work for DDP, as multiple dataloaders get instantiated.
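To illustrate the concern (assuming the warning fires in code that runs inside each worker): a per-process cache cannot deduplicate across `DataLoader` worker processes, since every worker gets its own copy:

```python
import warnings
from torch.utils.data import DataLoader, Dataset

class Probe(Dataset):
    def __len__(self) -> int:
        return 8

    def __getitem__(self, index: int) -> int:
        # Stand-in for the version check; Python's default filter dedupes
        # this within a process, but each worker is a separate process.
        warnings.warn("lightning-sdk is outdated")
        return index

if __name__ == "__main__":
    for _ in DataLoader(Probe(), num_workers=4):
        pass  # the warning appears once per worker, not once overall
```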
yeah, makes sense.