Learning to Simulate: Time and cost of training
I am trying to train the model on WaterRamps as in the example, and I have two questions:
- How many steps does it take to fit the model, on average?
- How much time do you estimate it would take to train the model on a laptop CPU (e.g. an i7-11800H)? Is it feasible at all?
Thank you for your help!
I would refer you to the main paper for question 1. We trained for 20M steps, but performance at 2.5M steps is usually already very good.
Training this model on a CPU in a reasonable amount of time is probably unfeasible.
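As a rough back-of-the-envelope check (the throughput numbers below are illustrative assumptions, not measurements from this repo), you can convert a step budget into wall-clock time:

```python
def training_days(total_steps: float, steps_per_second: float) -> float:
    """Convert a training-step budget into wall-clock days."""
    seconds = total_steps / steps_per_second
    return seconds / 86400  # seconds per day

# Hypothetical throughputs for illustration only:
gpu_sps = 10.0  # assumed steps/s on a modern GPU
cpu_sps = 0.5   # assumed steps/s on a laptop CPU

print(f"2.5M steps on GPU: {training_days(2.5e6, gpu_sps):.1f} days")
print(f"2.5M steps on CPU: {training_days(2.5e6, cpu_sps):.1f} days")
```

Under these assumed rates, 2.5M steps would take a few days on a GPU but roughly two months on a CPU, which is why CPU-only training is impractical here.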
Thank you very much for your response! Unfortunately, I don't have a GPU available at the moment to do the training. Would it be possible for you to share the trained model weights directly? Thanks again!
Thank you. Unfortunately, we will not be able to share the trained model weights at this point, but I would encourage you to look for other GitHub issues where users are training this model and perhaps ask there (and link it from here!).