RoyaltyLJW
Hi @meakbiyik, have you found any solution to this problem? I find that if I break before all the frames are extracted, stream.close causes everything to hang.
@meakbiyik Thanks a lot. It fixed my deadlock issue.
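For anyone else landing here: the general shape of this hang can be reproduced with nothing but a bounded queue, independent of BMF's internals. A decoder thread blocks on a full queue, the consumer breaks out early, and a plain join/close then waits forever. A minimal sketch of the early-exit fix (the producer/queue names are illustrative, not BMF API):

```python
import queue
import threading

def decode_frames(q: queue.Queue, stop: threading.Event, n_frames: int = 100) -> None:
    """Producer: pushes fake 'frames' into a bounded queue.

    Without the `stop` check, an early-exiting consumer leaves this
    thread blocked forever on q.put() -- the hang described above.
    """
    for i in range(n_frames):
        while not stop.is_set():
            try:
                q.put(i, timeout=0.1)  # bounded wait so we can notice `stop`
                break
            except queue.Full:
                continue
        else:
            return  # consumer asked us to stop

q = queue.Queue(maxsize=2)  # small buffer, like a decoder's frame pool
stop = threading.Event()
t = threading.Thread(target=decode_frames, args=(q, stop))
t.start()

frames = [q.get() for _ in range(5)]  # consume a few frames, then break early
stop.set()                            # signal the producer *before* "closing"
t.join(timeout=5)
assert not t.is_alive(), "producer should exit once stop is signalled"
```

The key point is that the producer must have some way to observe the consumer's early exit; otherwise it stays blocked on the full buffer and any close/join deadlocks.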
What's more, a memory leak happens in the code below due to an OpenCV operation:

```python
def generator_mode(input_list):
    input_path, threads = input_list
    start = time.time()
    graph = bmf.graph()
    video = graph.decode({
        'input_path': input_path,
        ...
```
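As a general illustration (not the BMF/OpenCV internals): this kind of growth often comes from holding a reference to every decoded frame instead of only the small per-frame result. A stdlib-only sketch with `tracemalloc`, using `bytearray` as a stand-in for a decoded frame:

```python
import tracemalloc

FRAME_BYTES = 100_000  # stand-in size for one decoded frame

def process(frame: bytearray) -> int:
    # cheap stand-in for an OpenCV operation on the frame
    return len(frame) % 256

def hold_all_frames(n: int) -> list:
    # anti-pattern: every frame stays referenced, so none can be freed
    return [bytearray(FRAME_BYTES) for _ in range(n)]

def keep_results_only(n: int) -> list:
    results = []
    for _ in range(n):
        frame = bytearray(FRAME_BYTES)   # frame is dropped each iteration
        results.append(process(frame))   # only the small result survives
    return results

tracemalloc.start()
kept = hold_all_frames(50)
holding, _ = tracemalloc.get_traced_memory()  # ~50 frames alive at once
kept = None                                   # release them
slim_results = keep_results_only(50)
slim, _ = tracemalloc.get_traced_memory()     # only 50 small ints alive
assert slim < holding / 10
```

If memory still grows with only per-frame results retained, the leak is below Python (e.g. native buffers), which is a different hunt.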
Thanks, I will try it. Can it run on CPU?
It seems that it uses the encode module to save the frame in JPEG format, but I do not want to save it as a file. Instead, I want to get...
Thanks a lot, the ffmpeg version I tested in both ways is the same. I will try it later.
> Hello! This error may be caused by the I/O virtualization of the device, which blocks the P2P communication between the cards. I solved it by disabling the I/O virtualization...
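For anyone hitting the same symptom, a few standard ways to check whether the P2P path between cards is the culprit (these are stock NVIDIA tools plus the standard NCCL environment variable; `your_inference_script.py` is a placeholder for whatever you actually run):

```shell
# Show the interconnect topology between GPUs; SYS links (vs PIX/PXB/NV#)
# mean traffic crosses the host bridge, where virtualization can break P2P.
nvidia-smi topo -m

# From the CUDA samples (if built): verify P2P copies actually work.
./p2pBandwidthLatencyTest

# Quick workaround check: if the gibberish disappears with P2P disabled
# for NCCL, the P2P path is the problem.
NCCL_P2P_DISABLE=1 python your_inference_script.py
```

If `NCCL_P2P_DISABLE=1` fixes the output, the BIOS-level fix (disabling I/O virtualization / ACS, as suggested above) is worth trying.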
Yes, and I tried some of the methods mentioned in these two issues, but they don't work. Setting the system prompt length to more than 50 is also useless.
@kratorado @Tejaswgupta What type of GPU do you use for inference, an NVIDIA A-series (A30) or L-series (L20)? I tried inference on an A30 but failed to reproduce. As mentioned...
> @RoyaltyLJW I'm using A100-80GB. It produces gibberish like this https://pastebin.com/fvy3DsSH > > ``` > $ nvcc --version > nvcc: NVIDIA (R) Cuda compiler driver > Copyright (c) 2005-2024 NVIDIA...