[Bug]: NotImplementedError
Describe the bug
The llmlingua-2 model was downloaded from Hugging Face.
code:
self.llm_lingua = PromptCompressor(
    model_name="/home/webservice/llm/compressFromNet/llmlingua-2-xlm",
    use_llmlingua2=True,  # whether to use llmlingua-2
)
compressed_prompt = self.llm_lingua.compress_prompt(
    context,  # context is a str
    rate=0.33,
    force_tokens=["\n", "?"],
    drop_consecutive=True,
)
print(compressed_prompt)
Error:
compressed_prompt = self.llm_lingua.compress_prompt(
File "/home/webservice/miniconda3/envs/tka/lib/python3.10/site-packages/llmlingua/prompt_compressor.py", line 472, in compress_prompt
return self.compress_prompt_llmlingua2(
File "/home/webservice/miniconda3/envs/tka/lib/python3.10/site-packages/llmlingua/prompt_compressor.py", line 776, in compress_prompt_llmlingua2
context_probs, context_words = self.__get_context_prob(
File "/home/webservice/miniconda3/envs/tka/lib/python3.10/site-packages/llmlingua/prompt_compressor.py", line 2134, in __get_context_prob
) = self.__merge_token_to_word(
File "/home/webservice/miniconda3/envs/tka/lib/python3.10/site-packages/llmlingua/prompt_compressor.py", line 2199, in __merge_token_to_word
elif is_begin_of_new_word(token, self.model_name, force_tokens, token_map):
File "/home/webservice/miniconda3/envs/tka/lib/python3.10/site-packages/llmlingua/utils.py", line 94, in is_begin_of_new_word
raise NotImplementedError()
NotImplementedError
Steps to reproduce
No response
Expected Behavior
No response
Logs
No response
Additional Information
No response
Hi @isAXuan, thanks for your feedback.
It looks like a local model path issue. Could you try setting model_name="microsoft/llmlingua-2-xlm-roberta-large-meetingbank" for now?
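For reference, a minimal sketch of that suggestion, mirroring the call from the original snippet (the rate and force_tokens values are carried over from the report, not prescribed by the library):

    from llmlingua import PromptCompressor

    # Load llmlingua-2 directly from the Hugging Face Hub so the model name
    # contains "xlm-roberta-large", which the word-boundary logic recognizes.
    llm_lingua = PromptCompressor(
        model_name="microsoft/llmlingua-2-xlm-roberta-large-meetingbank",
        use_llmlingua2=True,
    )
    compressed_prompt = llm_lingua.compress_prompt(
        context,  # the str context from the report
        rate=0.33,
        force_tokens=["\n", "?"],
        drop_consecutive=True,
    )
    print(compressed_prompt)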
Thank you very much for your reply. My problem has been solved. I checked my environment, and the issue appears to have been caused by the naming of my local model weight files.
@jiapingW Hi, I ran into the same problem. What does it have to do with the naming? How did you solve it in the end?
Rename the local directory where you downloaded the model from Hugging Face so that the directory name includes "xlm-roberta-large".
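For anyone hitting this later: based on the traceback above, is_begin_of_new_word in llmlingua/utils.py appears to branch on substrings of model_name to decide how word boundaries are detected, and falls through to NotImplementedError when no supported base model is recognized. A simplified sketch of that kind of check (the exact strings and tokenizer markers are assumptions, not the library's verbatim code):

    # Simplified illustration of the dispatch that ends in NotImplementedError.
    # The real logic lives in llmlingua/utils.py.
    def is_begin_of_new_word(token, model_name, force_tokens, token_map):
        if "bert-base-multilingual-cased" in model_name:
            # BERT-style tokenizers prefix word-internal pieces with "##".
            return not token.startswith("##")
        elif "xlm-roberta-large" in model_name:
            # SentencePiece/XLM-R tokenizers prefix word starts with "▁".
            return token.startswith("▁") or token in force_tokens
        else:
            # A local path such as ".../llmlingua-2-xlm" matches neither branch.
            raise NotImplementedError()

Under that reading, renaming the local checkpoint directory to something like "llmlingua-2-xlm-roberta-large" lets the substring check match again, which is why the rename fixes the error.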