yixingfu
Random search with a rather large `max_trials` does the job. Random search ensures a non-repeating search, and will exit if it is not possible to populate the next search space.
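A minimal sketch of that setup, assuming a `build_model(hp)` function already exists; the objective, trial count, and directory names here are placeholders:

```
import keras_tuner as kt

# A rather large max_trials effectively exhausts a small search space.
# RandomSearch skips hyperparameter combinations it has already tried and
# stops early once no new combination can be generated.
tuner = kt.RandomSearch(
    build_model,                # assumed build_model(hp) function
    objective="val_accuracy",   # placeholder objective
    max_trials=1000,
    directory="tuning_dir",     # placeholder directory/project names
    project_name="exhaustive_random_search",
)
```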
#369 should have fixed this issue. Before Keras Tuner's next release, you can try `pip install git+git://github.com/keras-team/keras-tuner@master`.
@henry090 is it the same issue (local file system unimplemented, happening when checkpointing)?
> @yixingfu This is what I get:
>
> ```
> AttributeError: module 'tensorflow._api.v2.distribute' has no attribute 'TPUStrategy'
> ```

@henry090 I see. That is a slightly different thing. This...
Can you share a colab so I can take a look?
> @yixingfu Sorry for the late reply, https://colab.research.google.com/drive/1tQuB6v-_b09lQQN7EkQeV9CzvndCO6SZ?usp=sharing

You need to use `strategy = tf.distribute.TPUStrategy()` instead of `tf.distribute.experimental.TPUStrategy()`. That being said, it looks like the checkpointing callbacks are indeed trying...
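For the strategy setup itself, a minimal Colab-style sketch assuming TF 2.3+; the resolver arguments may differ per environment:

```
import tensorflow as tf

# Typical Colab TPU setup (TF 2.3+); the resolver arguments may vary by environment.
resolver = tf.distribute.cluster_resolver.TPUClusterResolver()
tf.config.experimental_connect_to_cluster(resolver)
tf.tpu.experimental.initialize_tpu_system(resolver)

# Use the stable API instead of the experimental alias.
strategy = tf.distribute.TPUStrategy(resolver)
```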
> @yixingfu any news? I'm kind of wondering on this.

Looks like the checkpoint is not being saved. This [gist](https://gist.github.com/yixingfu/8711beaa3b3d508b037a5195a35db88e) shows a working example of using Keras Tuner directly. Not...
One thing you can do is:

```
from functools import partial

def build_model_with_extra_args(hp, some_val_1, some_val_2):
    ...

# Bind the extra arguments so that the resulting function takes only `hp`.
build_model = partial(build_model_with_extra_args,
                      some_val_1=value_for_some_val_1,
                      some_val_2=value_for_some_val_2)
```

In Keras Tuner, you always need to...
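The resulting `build_model` only takes `hp`, so it can be handed to a tuner as usual. A hedged sketch, with placeholder objective, trial count, and data variables:

```
import keras_tuner as kt

# The extra arguments are already bound by partial, so the tuner can call build_model(hp).
tuner = kt.RandomSearch(build_model, objective="val_loss", max_trials=10)
tuner.search(x_train, y_train, validation_data=(x_val, y_val))  # x_train, y_train, x_val, y_val are placeholders
```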
This looks very good! But I don't think the original [paper for hyperband](https://arxiv.org/pdf/1603.06560.pdf) says anything specific about whether to use the weights from the previous round (or maybe I just went through the...
Keras-Tuner requires callbacks to be deep-copyable because it needs a fresh copy of the same callback for each trial (each time the tuner calls `model.fit`). I don't think there is...
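A quick way to check whether a custom callback will work is to deep-copy it yourself; a sketch that mimics (not reproduces) what happens per trial:

```
import copy
import tensorflow as tf

# Each trial gets its own copy of every callback, so the objects passed to
# tuner.search(callbacks=[...]) need to survive copy.deepcopy.
callbacks = [tf.keras.callbacks.EarlyStopping(monitor="val_loss", patience=2)]

per_trial_callbacks = [copy.deepcopy(cb) for cb in callbacks]  # raises if a callback is not deep-copyable
```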