Ishan Kumar

Results: 14 comments by Ishan Kumar

Hi @vfdev-5 Great idea of using `pytest.mark.parametrize`. A simple fix in https://github.com/pytorch/ignite/blob/3a286b1d13a4a0476b3ee8e8046f16465818c9f6/tests/ignite/handlers/test_time_limit.py#L18-L39 would be to write it as

```python
@pytest.mark.parametrize("n_iters, limit", [(20, 10), (5, 10)])
def test_terminate_on_time_limit(n_iters, limit):
    started = time.time()
    trainer = ...
```
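The parametrize pattern above can be sketched end to end with a dummy loop standing in for the ignite trainer. The body below is illustrative only: `n_iters` and `limit` mirror the names in the comment, but the sleep loop is a hypothetical stand-in for a real training run, not ignite's actual `TimeLimit` test.

```python
import time

import pytest


# One decorated test covers both cases (20 iterations and 5 iterations,
# each with a 10-second limit) instead of two near-duplicate functions.
@pytest.mark.parametrize("n_iters, limit", [(20, 10), (5, 10)])
def test_terminate_on_time_limit(n_iters, limit):
    started = time.time()
    for _ in range(n_iters):
        time.sleep(0.001)  # hypothetical stand-in for one training iteration
    elapsed = time.time() - started
    # The run should finish well inside the time limit.
    assert elapsed < limit
```

Because `pytest.mark.parametrize` only attaches marks to the function, the test can also be invoked directly with explicit arguments when debugging.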

@sdesrozis @vfdev-5 I have added the tests; a lot of them are similar to the EarlyStopping tests and might not even be needed, since they both use the NoImprovementHandler functions....

@sdesrozis I was thinking about the redundancies. While it's true that they both call the same function, do you think it is still worth retaining the `test_state_dict` as it...

Hi @sdesrozis and @arisliang, I am interested in taking this up. From what I understand, we should create a Base class called something like `NoImprovementHandler` and then use it to...
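The base-class idea mentioned above could take roughly the following shape. This is a sketch under stated assumptions only: the method names, attributes, and the `EarlyStopping` subclass shown here are hypothetical illustrations of the pattern, not ignite's actual `NoImprovementHandler` design.

```python
class NoImprovementHandler:
    """Hypothetical base: track a score and react after `patience` rounds
    without improvement. Subclasses decide what "react" means."""

    def __init__(self, patience, score_function):
        self.patience = patience
        self.score_function = score_function
        self.best_score = None
        self.counter = 0

    def __call__(self, engine):
        score = self.score_function(engine)
        if self.best_score is None or score > self.best_score:
            # New best score: reset the no-improvement counter.
            self.best_score = score
            self.counter = 0
        else:
            self.counter += 1
            if self.counter >= self.patience:
                self.stop(engine)

    def stop(self, engine):
        # Subclasses define the action taken when patience is exhausted.
        raise NotImplementedError


class EarlyStopping(NoImprovementHandler):
    def stop(self, engine):
        # Stand-in for terminating the run (e.g. engine.terminate() in ignite).
        engine.should_terminate = True
```

With this split, other behaviours (logging a warning, reducing a learning rate) would only need to override `stop`, which is the kind of reuse the comment is aiming at.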

Hi @sdesrozis and @vfdev-5, I think this is a great idea and a much-needed tool. I am interested in implementing it. I agree with @vfdev-5's idea of having it in...

Hi @vfdev-5, sorry for the delay. I'll have a look at it today :)

@sdesrozis I tested the code on my local example, with this additional code:

```python
# Define a PT Profiler
pt_profiler = PyTorchProfiler(on_trace_ready="tensorboard", output_path="./logs/train")
pt_profiler.attach(trainer)
```

it produces 3 JSON files...
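The `attach(trainer)` pattern above can be illustrated without ignite or torch installed. Everything below is a minimal hypothetical stand-in: `MiniEngine`, the event names, and `SimpleProfiler` are assumptions that mimic the handler-attachment mechanism, while the real `PyTorchProfiler` under discussion wraps `torch.profiler` and writes TensorBoard/JSON traces instead of per-iteration timings.

```python
import time


class MiniEngine:
    """Toy engine: fires start/complete events around each iteration."""

    def __init__(self):
        self.handlers = {"ITERATION_STARTED": [], "ITERATION_COMPLETED": []}

    def add_event_handler(self, event, handler):
        self.handlers[event].append(handler)

    def run(self, n_iters):
        for _ in range(n_iters):
            for h in self.handlers["ITERATION_STARTED"]:
                h(self)
            time.sleep(0.001)  # stand-in for one training step
            for h in self.handlers["ITERATION_COMPLETED"]:
                h(self)


class SimpleProfiler:
    """Records wall-clock time per iteration via the engine's events."""

    def __init__(self):
        self.iter_times = []
        self._t0 = None

    def attach(self, engine):
        # Hook both ends of each iteration, mirroring profiler.attach(trainer).
        engine.add_event_handler("ITERATION_STARTED", self._start)
        engine.add_event_handler("ITERATION_COMPLETED", self._stop)

    def _start(self, engine):
        self._t0 = time.perf_counter()

    def _stop(self, engine):
        self.iter_times.append(time.perf_counter() - self._t0)


profiler = SimpleProfiler()
engine = MiniEngine()
profiler.attach(engine)
engine.run(3)  # collects one timing per iteration
```

The design point is that the profiler never modifies the training loop itself; it only registers callbacks, which is why a single `attach` call is enough.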

@sdesrozis great, will start working on the tests.

@sdesrozis, Added some tests for the profiler. I have not added checks for the output of the profiler since I believe that is already done by PyTorch. I am new...

@sdesrozis I have incorporated most of your suggestions. I am still working on the distributed tests will add those soon too.