Timothé
> It is also in the documentation: https://stable-baselines3.readthedocs.io/en/master/guide/tensorboard.html#directly-accessing-the-summary-writer. For non-advanced users it may not be obvious that hyperparameters can be logged this way (and it is not recommended)....
I have done it with a **callback** (I used SB3, but I think it should work with previous versions).

```python
import os
from tensorboard.plugins.hparams import api as hp
from...
```
You are right @rogierz, metric values that are passed to `HParam` through the `metric_dict` won't be saved. They are supposed to reference metrics that have been logged separately (otherwise they...
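To make the point above concrete, here is a minimal pure-Python stand-in (not the SB3 implementation) showing why the values in `metric_dict` are discarded: only the *keys* register metric names for the HPARAMS tab, and the actual metric values must be written by separate logging calls.

```python
def register_hparams(hparam_dict: dict, metric_dict: dict) -> dict:
    """Illustrative stand-in: record hyperparameters and metric *names* only.

    The numeric values in metric_dict are placeholders and are never
    written anywhere; they merely tell the dashboard which logged
    metrics to associate with this run.
    """
    return {
        "hparams": dict(hparam_dict),
        "metric_names": list(metric_dict),  # keys only, values dropped
    }


meta = register_hparams(
    {"learning_rate": 3e-4, "gamma": 0.99},
    {"train/value_loss": 0.0},  # 0.0 is a placeholder, never saved
)
```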
> @timothe-chaumont could you review/test this one? Yep I'll have a look at it by the end of the week :)
Thank you @rogierz. The changes proposed in this PR follow the implementation in [pytorch tensorboard writer](https://github.com/pytorch/pytorch/blob/master/torch/utils/tensorboard/writer.py#L345-L346), and if added to SB3 they would have the following impact: - When custom...
> yes, look like a better fix, but hparams is asking for a non-empty metric dict (according to SB3 docstring), so you would assign zero values to all of them?...
@rogierz do you want to add this change: > Modify the HParam class to ask for metrics names only (without values) > ```python > HParam(hparam_dict, metric_list=['train/value_loss', 'custom_metric']) > ``` Or...
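For reference, a hypothetical sketch of what the proposed `metric_list`-only variant of the `HParam` class could look like (this is an assumption about the suggested API, not merged SB3 code):

```python
from typing import Mapping, Sequence


class HParam:
    """Hypothetical variant of SB3's HParam that takes metric names only,
    since the placeholder values in metric_dict are never saved anyway."""

    def __init__(self, hparam_dict: Mapping, metric_list: Sequence[str]):
        self.hparam_dict = dict(hparam_dict)
        # Fill in placeholder zeros internally, since the TensorBoard
        # hparams writer asks for a non-empty metric dict.
        self.metric_dict = {name: 0.0 for name in metric_list}


h = HParam(
    {"learning_rate": 3e-4},
    metric_list=["train/value_loss", "custom_metric"],
)
```

This would keep the writer-facing interface unchanged while sparing users from passing dummy values.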