Using `find_executable_batch_size` decorator with `fastai`
Hello,
I am trying to use the `find_executable_batch_size` decorator to find the optimal batch size in fastai, but I'm having some trouble getting it to work.
In fastai, the batch size is fixed when the dataloaders are created. I tried using the decorator in the following way, but building the dataloaders alone doesn't put any batch on the GPU, so the decorator never sees an out-of-memory error.
```python
@find_executable_batch_size(starting_batch_size=1024)
def get_dl(batch_size):
    return ImageDataLoaders.from_name_func(
        path, files, label_func, item_tfms=Resize(224), bs=batch_size
    )
```
I even tried an alternate approach:

```python
learn = vision_learner(get_dl(), resnet34, metrics=error_rate)
```
I'm not sure how to go about this. AFAIK, batches are only put on the GPU once we call `learn.fine_tune` or `learn.fit_one_cycle`. How can I use the decorator in that case? TIA!
@deven-gqc a fastai callback would really need to be made in order for this to be very efficient.
However, generally the basic premise of what the guide states is still accurate:
> To use it, restructure your training function to include an inner function that includes this wrapper, and build your dataloaders inside it
This means doing something like:
```python
@find_executable_batch_size(starting_batch_size=1024)
def my_train_func(batch_size: int):
    learn = vision_learner(get_dl(batch_size), resnet34, metrics=error_rate)
    learn.fit_one_cycle(2)

my_train_func()
```
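Note that `get_dl` here should be a plain, undecorated helper that just builds the dataloaders at the requested batch size, e.g. (adapting your snippet from above):

```python
def get_dl(batch_size):
    # Plain helper with no decorator of its own; the retry logic
    # lives entirely in my_train_func
    return ImageDataLoaders.from_name_func(
        path, files, label_func, item_tfms=Resize(224), bs=batch_size
    )
```

That way each retry rebuilds the dataloaders with the halved batch size before `fit_one_cycle` ever touches the GPU.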
Thanks @muellerzr, I got it to work with the function. However, the output gets a little messy. I think if there were a separate fastai callback, it would fix the issue.

Great! @deven-gqc, I'll try to look into something like that soon. I'll go ahead and mark this as a feature request so it stays open :)
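In the meantime, here's a rough sketch of roughly what that integration might boil down to, written as a plain retry wrapper rather than a proper `Callback` (hypothetical and untested; `fit_with_auto_bs` and `make_learner` are illustrative names, not fastai API):

```python
from fastai.vision.all import *

def fit_with_auto_bs(make_learner, starting_bs=1024, epochs=2):
    # Same idea as find_executable_batch_size: halve the batch size
    # on CUDA OOM until training fits in memory
    bs = starting_bs
    while bs >= 1:
        try:
            learn = make_learner(bs)   # rebuild DataLoaders + model at this bs
            learn.fit_one_cycle(epochs)
            return learn
        except RuntimeError as e:      # CUDA OOM surfaces as a RuntimeError
            if "out of memory" not in str(e).lower():
                raise
            torch.cuda.empty_cache()   # free the failed allocation before retrying
            bs //= 2
    raise RuntimeError("No batch size fit in memory.")
```

A real callback version would need hooks into the fit loop so the progress output stays clean across retries, which is the messy part you mentioned.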
Thanks a lot Zach!