Mike
This looks like an awesome project. It would be great if there were a way to report hyperparameters with each submission.
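A minimal sketch of what attaching hyperparameters to a submission record could look like. The `Submission` dataclass and its fields are hypothetical, not part of any existing API:

```python
import json
from dataclasses import dataclass, field, asdict

@dataclass
class Submission:
    """Hypothetical record pairing a result with the hyperparameters that produced it."""
    model: str
    score: float
    hyperparameters: dict = field(default_factory=dict)

    def to_json(self) -> str:
        # Serialize so runs can be displayed and compared side by side.
        return json.dumps(asdict(self), sort_keys=True)

sub = Submission(
    model="resnet50",
    score=0.761,
    hyperparameters={"lr": 0.1, "batch_size": 32, "epochs": 90},
)
print(sub.to_json())
```

Keeping the hyperparameters as a free-form dict sidesteps having to agree on a fixed schema up front.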
https://dev-discuss.pytorch.org/t/torch-deploy-the-build/238
https://github.com/vahidk/tfrecord
https://mobile.twitter.com/iamtrask/status/1196081965755248643
https://github.com/NVIDIA/apex/tree/master/apex/pyprof
https://mobile.twitter.com/jeremyphoward/status/1187752948744327168
## Train

Allow queuing experiments with ranked priority; potentially handle running on a remote machine.

```console
yann train -m resnet50 -d ImageNet -bs 32 --distributed --priority=2
```

## Scaffold Project...
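One way to sketch the ranked-priority queue described above, using Python's `heapq`. The `ExperimentQueue` class and its methods are illustrative, not the project's actual API:

```python
import heapq
import itertools

class ExperimentQueue:
    """Run higher-priority experiments first; FIFO among equal priorities."""

    def __init__(self):
        self._heap = []
        self._counter = itertools.count()  # tie-breaker preserving submission order

    def submit(self, command: str, priority: int = 0):
        # heapq is a min-heap, so negate the priority to pop the highest first.
        heapq.heappush(self._heap, (-priority, next(self._counter), command))

    def next(self) -> str:
        _, _, command = heapq.heappop(self._heap)
        return command

q = ExperimentQueue()
q.submit("yann train -m resnet50 -d ImageNet -bs 32", priority=2)
q.submit("yann train -m resnet18 -d CIFAR10", priority=1)
print(q.next())  # the priority=2 job comes out first
```

For remote execution, the popped command string could then be handed to whatever dispatch mechanism is in place (SSH, a job scheduler, etc.).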
Integrate with common hyperparameter-optimization frameworks.