Updating linen_examples/vae
This issue tracks updating the vae example to follow the practices outlined in #231.
- [x] Port to linen API - once ported, all subsequent changes should be done in `linen_examples/vae`
- [ ] Update `README.md`
- [x] Add `requirements.txt`
- [ ] Update file structure
- [ ] Use `ml_collections.ConfigDict`? (minimal config sketch after this list)
- [ ] Add benchmark test?
- [ ] Add unit test for training/eval step
- [ ] Add Colab
- [ ] Adhere to Google Python style
- [ ] Add mypy annotations
- [ ] Shorten/beautify training loop (consider using `clu` for this)
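For the `ml_collections.ConfigDict` item, the other examples typically keep hyperparameters in a `configs/default.py` that returns a `ConfigDict`. A minimal sketch of what that could look like here; the field names and values are illustrative assumptions, not the actual vae configuration:

```python
# configs/default.py -- minimal sketch of a ConfigDict-based configuration.
# Field names and values are illustrative; the real vae config may differ.
import ml_collections


def get_config() -> ml_collections.ConfigDict:
  """Returns the default hyperparameter configuration."""
  config = ml_collections.ConfigDict()
  config.learning_rate = 1e-3
  config.latents = 20       # dimensionality of the VAE latent space
  config.batch_size = 128
  config.num_epochs = 30
  return config
```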
Note that "file structure" in #231 got changed (as part of #634):
> To make it easier to test, maintain and reuse code, structure the code as follows:
>
> - `main.py` contains the flags and calls a method from `train.py` to run the training loop. This should be the only file defining flags! This can be almost identical for all examples, please copy from `linen_examples/wmt/main.py`.
> - `train.py` contains classes and methods for training and evaluating the model.
> - `train_test.py` for test cases of the training code. At a minimum the test should run a single training step, but more fine-grained unit tests are a bonus. You can use `tfds.testing.mock_data` to avoid reading real data from disk.
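As an illustration of that layout, a minimal `main.py` could look like the sketch below, following the pattern of the wmt example referenced above. `train.train_and_evaluate` is an assumed name for the entry point in `train.py`, not necessarily what the vae code uses.

```python
# main.py -- sketch of a flags-only entry point, per the structure above.
# `train.train_and_evaluate` is an assumed name for the method in train.py.
from absl import app
from absl import flags
from ml_collections import config_flags

import train

FLAGS = flags.FLAGS

flags.DEFINE_string('workdir', None, 'Directory to store model data.')
config_flags.DEFINE_config_file(
    'config', None, 'File path to the training hyperparameter configuration.')


def main(argv):
  if len(argv) > 1:
    raise app.UsageError('Too many command-line arguments.')
  train.train_and_evaluate(FLAGS.config, FLAGS.workdir)


if __name__ == '__main__':
  app.run(main)
```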
Unassigning myself and adding "pull requests welcome"
Mind if I take a run at this? I might not be able to do everything in the checklist but can certainly move this along
Yes, please have a run.
Happy to answer questions and review PRs.
Thanks! Feel free to assign this to me. I'll likely open multiple smaller PRs for sub-pieces rather than one big one. If that doesn't sound like a good plan, please let me know!
We should reopen the issue and mark some of the tasks as done
Even mypy and Google Style Guide can be checked IMO. Gentle ping @levskaya
Updated Tracker
- [x] Port to linen API
- [x] Update `README.md`
- [x] Add `requirements.txt`
- [x] Update file structure
- [x] Use `ml_collections.ConfigDict`
- [ ] Add benchmark test
- [ ] Add unit test for training/eval step (rough sketch after this list)
- [ ] Add Colab
- [x] Adhere to Google Python style
- [x] Add mypy annotations
- [ ] Shorten/beautify training loop (consider using `clu` for this)
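For the still-open unit test item, a smoke test along the lines suggested in the file-structure note above (using `tfds.testing.mock_data` so no real data is read from disk) could look roughly like this. `train.train_and_evaluate`, the `configs.default` module, and the config fields are assumptions about the code layout, not the actual vae API:

```python
# train_test.py -- rough sketch of a single-run smoke test for training.
# `train.train_and_evaluate`, `configs.default` and the config fields are
# assumed names; adapt them to the actual vae example code.
import tempfile

from absl.testing import absltest
import tensorflow_datasets as tfds

import train
from configs import default as default_config


class TrainTest(absltest.TestCase):

  def test_train_and_evaluate(self):
    config = default_config.get_config()
    config.num_epochs = 1   # keep the test short
    config.batch_size = 8
    workdir = tempfile.mkdtemp()
    # mock_data() serves a handful of random examples instead of reading
    # the real dataset from disk.
    with tfds.testing.mock_data(num_examples=8):
      train.train_and_evaluate(config, workdir)


if __name__ == '__main__':
  absltest.main()
```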
Request for permission to modify
The current vae example does not work; may I make some minor modifications?
Modifications
Two modifications are as follows:
- modify `requirements.txt`
  - add `clu==0.0.6`
  - upgrade `flax==0.6.9` to `flax==0.7.4`
- modify `README.md`
  - remove `--workdir=/tmp/mnist` from an execution command
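For reference, the affected `requirements.txt` lines after these changes would be (other pinned entries unchanged and omitted here):

```
clu==0.0.6
flax==0.7.4
```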
Reasons
- `flax==0.7.4`
  - handling of typed PRNG keys was updated in version 0.7.3
  - `flax==0.7.3` is yanked
- remove `--workdir=/tmp/mnist`
  - the `workdir` flag definition was removed from `main.py` (commit db7b1762)
Yes, please go ahead and create a PR to fix the example.
Thank you very much. I have created a PR, please check it.