Luis Fernando Leal

Results: 9 comments of Luis Fernando Leal

Any news on this? Or is there some status page we can follow?

For me, adding this line before any other code did the trick: tf.reset_default_graph()
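A minimal sketch of the pattern, assuming TensorFlow 1.x-style graph code (in TF 2.x the same call lives under tf.compat.v1, which is used here so the sketch runs on both). Re-running graph-building code in the same process, e.g. a notebook cell, keeps appending ops to the default graph and can trigger duplicate-variable errors; resetting clears it.

```python
import tensorflow.compat.v1 as tf

# Graph-mode code needs eager execution disabled under TF 2.x.
tf.disable_eager_execution()

tf.reset_default_graph()  # start from an empty default graph

# Illustrative graph: any variable/op names here are examples, not
# taken from the original issue.
x = tf.placeholder(tf.float32, shape=[None, 3], name="x")
w = tf.get_variable("w", shape=[3, 1])
y = tf.matmul(x, w)

# Calling it again wipes these ops, so the cell can be re-run safely
# without "variable already exists" errors.
tf.reset_default_graph()
```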

Hi @ArthurZucker, thanks for your reply. Where should I specify that parameter? I'm facing a similar issue ([https://stackoverflow.com/questions/79068298/valueerror-supplied-state-dict-for-layers-does-not-contain-bitsandbytes-an](https://stackoverflow.com/questions/79068298/valueerror-supplied-state-dict-for-layers-does-not-contain-bitsandbytes-an)) and I would truly appreciate your help. @1049451037 did you solve...

Thanks @SunMarc. Yes, I'm using the latest versions of both transformers and bitsandbytes. I created the Colab as you suggested (same issue there): https://colab.research.google.com/drive/1BGlj8zJYisJaJNIwjLinukcaaLMPuFAC?usp=sharing For context, my use case is...

@SunMarc thank you so much! It seems like it worked: in the Colab and locally I was able to load the saved quantized model, but not on the remote server I...

@SunMarc I apologize; this may not be the original problem, but it is definitely related, and some of your help could save a lot of time. The fix you provided solved...

@Pratik-Ghute were you able to solve it? I have the same issue. @aymenbenammar any recommendations? Based on the error message, it looks as if load_image expects an image path as...

Ignore my message; for a moment I forgot the code is available for me to see: https://github.com/IDEA-Research/GroundingDINO/blob/856dde20aee659246248e20734ef9ba5214f5e44/groundingdino/util/inference.py#L39

Has anyone found a solution? I'm facing the same issue: the function runs 4 times, seemingly once per available GPU.
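A hedged sketch of one common workaround, assuming a torchrun/accelerate-style launcher that spawns one process per GPU and sets a LOCAL_RANK environment variable (the variable name is the usual convention for those launchers, not something stated in the thread). Guarding side-effect code on rank 0 makes it run once instead of once per GPU.

```python
import os

def is_main_process() -> bool:
    # Launchers such as torchrun set LOCAL_RANK per spawned process.
    # Defaulting to "0" makes the guard a no-op in single-process runs.
    return os.environ.get("LOCAL_RANK", "0") == "0"

if is_main_process():
    # Anything here (logging, file writes, dataset downloads) executes
    # exactly once across all per-GPU processes.
    print("running side-effect code exactly once")
```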