Does batch processing process in parallel?
I was wondering if the batch processing feature on the interface processes all files at once using threading (since pm and harvest run primarily on the CPU), or processes each file separately, one at a time. I'm mainly asking because, if you properly separated the pieces of a larger inference audio from the silence, you could process them all at once and potentially reap both performance and quality improvements for long single-file inference.
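To illustrate the idea, here is a minimal sketch of what I mean, assuming a hypothetical `run_inference` function standing in for the per-chunk inference call (not part of this repo's API): split the long input on silence with librosa, run the chunks across processes since the F0 extraction is CPU-bound, then concatenate the results.

```python
# Sketch only: split a long input on silence, run inference on each chunk in
# parallel, then rejoin. `run_inference` is a hypothetical placeholder for
# whatever single-chunk inference the pipeline actually performs.
from concurrent.futures import ProcessPoolExecutor

import librosa
import numpy as np


def run_inference(chunk: np.ndarray, sr: int) -> np.ndarray:
    """Placeholder for the real single-chunk inference call."""
    raise NotImplementedError


def infer_long_file(path: str, top_db: int = 30, workers: int = 4) -> np.ndarray:
    y, sr = librosa.load(path, sr=None)
    # Find non-silent intervals; each one becomes an independent work item.
    intervals = librosa.effects.split(y, top_db=top_db)
    chunks = [y[start:end] for start, end in intervals]

    # CPU-bound F0 extraction (pm/harvest) should parallelize across processes.
    with ProcessPoolExecutor(max_workers=workers) as pool:
        results = list(pool.map(run_inference, chunks, [sr] * len(chunks)))

    # Reassemble in the original order (silence between chunks is dropped here).
    return np.concatenate(results)
```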
Yes, it's a good idea, but I need time to support it.
Hi, any update on this? Is it possible to infer in batch now?
This issue was closed because it has been inactive for 15 days since being marked as stale.