njarecki
hey guys- here's what I notice: when you use command-line infer, it takes much longer than the web GUI. I assume that's because there's startup overhead associated with the process....
Hi! Yes, I think I’m experiencing the exact problem you describe. It’s 15 seconds vs. 4 for a conversion. How do you set yours up? Would love to do it...
Seth, very impressive. Can you hit me back at [email protected]? Yeah, let's move offline. I may have some work for you if you're interested.
In the end we stuck with the stock RVC distribution and called the existing Gradio server API endpoints. Had to do a little Python scripting to make it work with our...
Google or ChatGPT “how to use gradio api endpoints”. --Nicholas
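For anyone following along, here's a minimal sketch of what that kind of scripting can look like, using the `gradio_client` package against an already-running RVC web UI. The server URL, endpoint name (`/infer_convert`), and argument list below are assumptions for illustration, not the exact script from this thread; check the "Use via API" page of your own server (or `client.view_api()`) for the real names. The point is that the model stays loaded in the server process, so each conversion skips the per-run startup cost of the CLI.

```python
# Sketch: drive a locally running RVC Gradio server from Python so the model
# stays resident between conversions (avoids CLI startup overhead each run).
from gradio_client import Client

# Connect to the already-running web UI (default local address assumed).
client = Client("http://127.0.0.1:7860/")

# Print the endpoints this server actually exposes, with their parameters.
client.view_api()

# Hypothetical call: endpoint name and parameters depend on your RVC build.
result = client.predict(
    "path/to/input.wav",       # source audio to convert (assumed parameter)
    0,                         # pitch shift in semitones (assumed parameter)
    api_name="/infer_convert"  # assumed endpoint name -- verify with view_api()
)
print(result)
```

Looping that `predict` call over a folder of files gives you batch conversion without paying the startup cost more than once.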
Is there some way I can do it? What I notice is that using batch infer with the standard distribution, there’s a bunch of startup time every time I run it. Whereas...