Model Checkpointing
**Describe the bug**
In main.py, `enable_checkpointing=False` is hard-coded. When I pass a Lightning `ModelCheckpoint` to `create_trainer()`, it throws the error:

```
Trainer was configured with "enable_checkpointing=False" but found "ModelCheckpoint" in callback list.
```
**To Reproduce**

```python
from deepforest import main
from pytorch_lightning.callbacks import ModelCheckpoint

checkpoint_callback = ModelCheckpoint(
    dirpath=ckpt_dir,  # ckpt_dir: user-defined output directory
    save_top_k=1,
    monitor="box_recall",
    mode="max",
    every_n_epochs=1,
)

m = main.deepforest()
m.use_release()
m.create_trainer(callbacks=[checkpoint_callback])
m.trainer.fit(m)
```
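A minimal sketch of the behavior being requested: rather than hard-coding `enable_checkpointing=False`, the flag could be derived from the callbacks the user passes. The `ModelCheckpoint` class below is a stand-in for `pytorch_lightning.callbacks.ModelCheckpoint`, and `resolve_enable_checkpointing` is a hypothetical helper, not DeepForest's actual API.

```python
class ModelCheckpoint:
    """Stand-in for pytorch_lightning.callbacks.ModelCheckpoint (illustration only)."""
    pass


def resolve_enable_checkpointing(callbacks):
    """Enable checkpointing only when the user supplies a ModelCheckpoint.

    This keeps the desired default (no checkpoints written) while avoiding
    the error raised when enable_checkpointing=False is combined with a
    ModelCheckpoint in the callback list.
    """
    callbacks = callbacks or []
    return any(isinstance(cb, ModelCheckpoint) for cb in callbacks)
```

With this logic, `create_trainer()` with no callbacks would still write no checkpoints, and passing a `ModelCheckpoint` would no longer conflict with the trainer configuration.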
**Environment (please complete the following information):**
- OS: Ubuntu
- Python version and environment: Python 3.8 in a venv
I had the same issue as #301. The fix in #304 is not enough; the hard-coded part should be removed (see #298).
Okay, let's look into this. We want 1) the default to be no checkpointing, and 2) users to be able to add their own checkpoint callback. The first step is to make a branch with two tests that fail under the current behavior.
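The two failing tests described above could look roughly like this. The sketch uses a minimal stand-in `create_trainer()` so it is self-contained; in the real test suite these would exercise `main.deepforest().create_trainer()` and inspect the resulting `pytorch_lightning.Trainer`, and the function and test names here are assumptions, not the actual DeepForest code.

```python
class ModelCheckpoint:
    """Stand-in for pytorch_lightning.callbacks.ModelCheckpoint (illustration only)."""
    pass


def create_trainer(callbacks=None):
    """Stand-in for the proposed create_trainer behavior."""
    callbacks = callbacks or []
    # Proposed fix: checkpointing is on only when a ModelCheckpoint is passed.
    enabled = any(isinstance(cb, ModelCheckpoint) for cb in callbacks)
    return {"enable_checkpointing": enabled, "callbacks": callbacks}


def test_default_no_checkpoint():
    # 1) Default: no checkpoints are written.
    trainer = create_trainer()
    assert trainer["enable_checkpointing"] is False


def test_user_supplied_checkpoint():
    # 2) A user-supplied ModelCheckpoint is accepted without error.
    cb = ModelCheckpoint()
    trainer = create_trainer(callbacks=[cb])
    assert trainer["enable_checkpointing"] is True
    assert cb in trainer["callbacks"]
```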
Creating a branch for this issue https://github.com/weecology/DeepForest/tree/model_checkpoint
Tests pass and merged into main.