Save Golden Chunks during Bigtable training
Saving the training examples that were sampled from Bigtable would be really useful for #591.
Can we persist the keyset we use? IIRC, we fetch all the keys, shuffle them in memory, and then turn them into the dataset. Can we write out just the set of keys? Then someone could repeat the training with the same examples straight out of Bigtable.
Definitely. Passing values_only=False to bigtable_input.get_unparsed_moves_from_last_n_games will return a dataset with (key, value) tensors. Applying .map(lambda k, v: k) to that dataset gives you the keys.
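A minimal sketch of dumping those keys, assuming the TF 1.x graph-mode setup the pipeline already uses; the helper name, file path, and the commented-out call arguments are placeholders, not the actual training code:

```python
import tensorflow as tf


def save_selected_keys(kv_dataset, path):
    """Writes the row keys of a (key, value) dataset to `path`, one per line."""
    key_ds = kv_dataset.map(lambda k, v: k)
    next_key = key_ds.make_one_shot_iterator().get_next()
    with tf.Session() as sess, open(path, 'w') as f:
        try:
            while True:
                # Each run() call advances the iterator by one key.
                f.write(sess.run(next_key).decode('utf-8') + '\n')
        except tf.errors.OutOfRangeError:
            pass


# Usage, wherever the training pipeline already builds the dataset:
#   ds = bigtable_input.get_unparsed_moves_from_last_n_games(
#       ..., values_only=False)
#   save_selected_keys(ds, 'selected_keys.txt')
```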
However, to use those keys again you need to pass the key dataset to tf.contrib.bigtable.BigtableTable.lookup_columns, which can be slow. It might be more expedient to simply store the final selected TFExamples in order in a TFRecord file, and reread from there.
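For the TFRecord route, something like this should work, assuming a TF release where `tf.data.experimental.TFRecordWriter` is available (older 1.x releases have it under `tf.contrib.data`); the helper and file names are placeholders:

```python
import tensorflow as tf


def save_golden_chunk(kv_dataset, path):
    """Stores the serialized tf.Examples of a (key, value) dataset, in order."""
    examples = kv_dataset.map(lambda k, v: v)
    # The tf.data TFRecordWriter serializes an entire string dataset to a
    # single TFRecord file in dataset order.
    write_op = tf.data.experimental.TFRecordWriter(path).write(examples)
    with tf.Session() as sess:
        sess.run(write_op)


# A later run can then train from exactly the same examples without
# touching Bigtable at all:
#   replay_ds = tf.data.TFRecordDataset('golden_chunk.tfrecords')
```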