kelly
When I use predict.py for SQuAD, I get this:

    line 259, in forward
        x_input = torch.cat([x1_aug, x2_aug, x1_aug * x2_aug], dim=3)
    RuntimeError: cuda runtime error (2) : out of memory...
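A minimal sketch of the usual mitigations for a CUDA out-of-memory error at prediction time (run under `torch.no_grad()` and shrink the eval batch); `model` and `batches` below are hypothetical stand-ins, not the actual objects in predict.py:

```python
# Sketch only: reduce GPU memory pressure during prediction.
# `model` and `batches` are hypothetical stand-ins, not the real objects in predict.py.
import torch
import torch.nn as nn

device = "cuda" if torch.cuda.is_available() else "cpu"
model = nn.Linear(16, 2).to(device)                              # placeholder for the real model
batches = [torch.randn(8, 16, device=device) for _ in range(3)]  # keep the eval batch size small

model.eval()
predictions = []
with torch.no_grad():                                  # no autograd buffers are kept
    for batch in batches:
        predictions.append(model(batch).cpu())         # move results off the GPU right away

if torch.cuda.is_available():
    torch.cuda.empty_cache()                           # release cached blocks afterwards
```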
What is the performance now?
> Hi, thanks for your attention to our work. The ConaCLIP weights have been released as `alibaba-pai/pai-conaclip-text2L-vit-small-patch16`, `alibaba-pai/pai-conaclip-text4L-vit-small-patch16` and `alibaba-pai/pai-conaclip-text6L-vit-small-patch16`. You can refer to, for example, [here](https://github.com/alibaba/EasyNLP/tree/master/examples/appzoo_tutorials/clip) to load and use...
Are only these models available?
> Hi @susht3, you mean that you want to export a `CLIPTextModel` and `CLIPVisionModel`?
>
> We support the CLIP export in `optimum`:
>
> ```shell
> optimum-cli export...
> ```
> [`CLIPTextOnnxConfig`](https://github.com/huggingface/optimum/blob/main/optimum/exporters/onnx/model_configs.py#LL620C49-L620C49).

Thanks, and which is the CLIP vision ONNX config? I can't find it.
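If your `optimum` version has no vision-side config yet, one possible workaround (not the `optimum` route, just a sketch) is to export the vision tower directly with `torch.onnx.export`; the checkpoint name, output file, and opset below are assumptions, not values from this thread:

```python
# Sketch: export only the CLIP vision tower with plain torch.onnx.export.
# Checkpoint name, file name and opset are assumptions, not values from this thread.
import torch
from transformers import CLIPVisionModel

model = CLIPVisionModel.from_pretrained("openai/clip-vit-base-patch32")
model.config.return_dict = False   # return a tuple so tracing is straightforward
model.eval()

dummy = torch.randn(1, 3, 224, 224)   # (batch, channels, height, width) at 224px resolution

torch.onnx.export(
    model,
    (dummy,),
    "clip_vision.onnx",
    input_names=["pixel_values"],
    output_names=["last_hidden_state", "pooler_output"],
    dynamic_axes={"pixel_values": {0: "batch"}},
    opset_version=14,
)
```

The exported graph then takes `pixel_values` and returns the patch-level hidden states plus the pooled image embedding.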
I can load hf_altclip and the imports work now, but the model cannot be downloaded:

    from hf_altclip.modeling_altclip import AltCLIP
    from hf_altclip.processing_altclip import AltCLIPProcessor

The error is:

    >>> model = AltCLIP.from_pretrained("BAAI/AltCLIP")
    Downloading (…)lve/main/config.json: 100%| 5.13k/5.13k [00:00
    Traceback (most recent call last):
      File "test.py", line 10, in <module>
        model = AltCLIP.from_pretrained("BAAI/AltCLIP")
      File "anaconda3/envs/py3/lib/python3.8/site-packages/transformers/modeling_utils.py", line 1833, in from_pretrained
        config, model_kwargs = cls.config_class.from_pretrained(
      File "//anaconda3/envs/py3/lib/python3.8/site-packages/transformers/configuration_utils.py", line 534, in from_pretrained
        config_dict,...

Adding auth_token=true does not help either:

    Traceback (most recent call last):
      File "/anaconda3/envs/py3/lib/python3.8/site-packages/transformers/configuration_utils.py", line 616, in _get_config_dict
        resolved_config_file = cached_path(
      File "/anaconda3/envs/py3/lib/python3.8/site-packages/transformers/utils/hub.py", line 284, in cached_path
        output_path = get_from_cache(
      File "anaconda3/envs/py3/lib/python3.8/site-packages/transformers/utils/hub.py", line 494, in get_from_cache...
> This error is caused by the transformers version; you can try downgrading, for example to 4.19.x.

Neither 4.20 nor 4.19 works; the download still fails with: OSError: We couldn't connect to 'https://huggingface.co' to load this model, couldn't find it in the cached files and it looks like BAAI/AltCLIP is not the path to...
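One hedged way around the connection error, assuming the machine can reach huggingface.co at least once (directly or via a mirror/proxy): pre-fetch the repository with `huggingface_hub` and point `from_pretrained` at the local snapshot. A minimal sketch:

```python
# Sketch: fetch BAAI/AltCLIP once, then load from the local snapshot path
# so from_pretrained does not need to reach the Hub again.
from huggingface_hub import snapshot_download
from hf_altclip.modeling_altclip import AltCLIP
from hf_altclip.processing_altclip import AltCLIPProcessor

local_dir = snapshot_download(repo_id="BAAI/AltCLIP")  # returns the path of the cached snapshot

model = AltCLIP.from_pretrained(local_dir)
processor = AltCLIPProcessor.from_pretrained(local_dir)
```

If even `snapshot_download` cannot connect, the same files can be cloned with git-lfs from https://huggingface.co/BAAI/AltCLIP and loaded from that local folder.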