[Bug]: after caption generation, the captioning model stays in GPU memory and is not removed if training is started immediately afterwards
What happened?
Loaded up one trainer and fixed the model paths, etc. Went to Tools > Dataset Tools, then Generate Captions. If you generate captions but don't restart, the captioning models stay in GPU memory and are never cleared. On my machine this causes training to overflow into system RAM. If I restart first, the problem does not occur.
It can easily be reproduced by running caption generation while watching GPU memory usage. Then start training and you will see that the BLIP models are never purged from memory.
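One way to quantify the repro, instead of eyeballing nvidia-smi: sample allocated VRAM before and after captioning. This is a sketch assuming the trainer runs on PyTorch; the helper name `vram_mb` is hypothetical.

```python
import torch

def vram_mb() -> float:
    """Return the CUDA memory currently allocated by PyTorch, in MiB (0 if no GPU)."""
    if not torch.cuda.is_available():
        return 0.0
    return torch.cuda.memory_allocated() / 2**20

# Call before captioning, after captioning, and again when training starts;
# if the post-captioning number never drops, the model was not released.
```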
What did you expect would happen?
The BLIP or BLIP2 model should be cleared from GPU memory once caption generation is done.
Relevant log output
I don't have this.
Output of pip freeze
No response
Please update and confirm whether this still occurs. I don't use the built-in models for captioning; instead I use external tools like Dataset Helpers and Taggui, both on GitHub.
@gnewtzie Can you please confirm if this still occurs? I will close this in a week if there is no response
Did one tonight and GPU memory stayed higher than it was before captioning. It seems the model is still resident.
Finally closed.