SIRIUS
> How can we further improve the lib to add support for a metadata mapping function like:
>
> ```
> def item_metadata_func(record: dict, metadata: dict) -> dict: ...
> ```
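A mapping function of that shape could look like the sketch below. The field names (`url`, `author`) are hypothetical; the function receives each raw JSON record plus the loader's default metadata and returns the merged metadata dict to attach to the resulting Document.

```python
# Hypothetical metadata mapping function: merge selected fields from
# the raw JSON record into the default per-document metadata.
def item_metadata_func(record: dict, metadata: dict) -> dict:
    # "url" and "author" are example record fields, not a fixed schema
    metadata["source"] = record.get("url", metadata.get("source"))
    metadata["author"] = record.get("author", "unknown")
    return metadata


record = {"url": "https://example.com/a", "author": "alice", "text": "hello"}
merged = item_metadata_func(record, {"seq_num": 1})
```

The loader would then call this function once per record, after building its default metadata (`source`, `seq_num`).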
A modified JSONLoader class based on the library source that only removes the jq dependency; it works fine with simple JSON and JSONL files.

```python
class JSONLoader(BaseLoader):
    def __init__(
        self,
        file_path: Union[str, Path],
        content_key: Optional[str]...
```
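A minimal jq-free sketch of the same idea is shown below. The `Document` class here is a stand-in for langchain's, and the `content_key`/`json_lines` behavior is an assumption about the intended semantics, not the project's exact API.

```python
import json
from pathlib import Path
from typing import Iterator, Optional, Union


class Document:
    # Minimal stand-in for langchain's Document, for illustration only.
    def __init__(self, page_content: str, metadata: dict):
        self.page_content = page_content
        self.metadata = metadata


class SimpleJSONLoader:
    """jq-free loader sketch: handles a JSON array/object or a JSONL file."""

    def __init__(
        self,
        file_path: Union[str, Path],
        content_key: Optional[str] = None,
        json_lines: bool = False,
    ):
        self.file_path = Path(file_path)
        self.content_key = content_key
        self.json_lines = json_lines

    def lazy_load(self) -> Iterator[Document]:
        text = self.file_path.read_text(encoding="utf-8")
        if self.json_lines:
            # JSONL: one JSON object per non-empty line
            records = [json.loads(ln) for ln in text.splitlines() if ln.strip()]
        else:
            data = json.loads(text)
            records = data if isinstance(data, list) else [data]
        for i, record in enumerate(records, start=1):
            if isinstance(record, dict) and self.content_key:
                content = str(record.get(self.content_key, ""))
            else:
                # No content_key: serialize the whole record as the content
                content = json.dumps(record, ensure_ascii=False)
            yield Document(
                page_content=content,
                metadata={"source": str(self.file_path), "seq_num": i},
            )
```

Because this uses only the standard-library `json` module, it avoids the jq install problem on Windows entirely.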
DuckDuckGo requires a proxy to reach, while some other search engines can be used directly inside China, so it would be best to have a configurable option for choosing which one to use.
Some proxy providers only support a subset of models, so it is necessary to use endpoint addresses from several different proxy providers.
It is recommended to delete the `info.db` file under `Langchain-Chatchat/knowledge_base` and the `Langchain-Chatchat\knowledge_base\samples\vector_store` folder, then re-initialize with `python init_database.py --recreate-vs`.
For the third issue, I solved it by modifying the JSONLoader class so that the jq library is no longer required, since jq cannot be installed on Windows.
For the second issue, I traced the exception to the `make_text_splitter` function in `server\knowledge_base\utils.py`: it fails when executing `tokenizer = AutoTokenizer.from_pretrained(text_splitter_dict[splitter_name]["tokenizer_name_or_path"], trust_remote_code=True)`, because ChineseRecursiveTextSplitter's `tokenizer_name_or_path` is None. How should this be handled?
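One possible workaround (a sketch of the idea, not the project's actual fix) is to guard the tokenizer lookup and fall back to counting characters when `tokenizer_name_or_path` is None, so splitters like ChineseRecursiveTextSplitter skip `AutoTokenizer` entirely:

```python
# Hypothetical guard around the tokenizer lookup in make_text_splitter:
# when a splitter has no tokenizer_name_or_path configured, fall back
# to measuring chunk length in characters instead of tokens.
def get_length_function(tokenizer_name_or_path):
    if not tokenizer_name_or_path:
        return len  # character-count fallback, no tokenizer needed
    # Assumes the transformers package is installed, as in the project.
    from transformers import AutoTokenizer
    tokenizer = AutoTokenizer.from_pretrained(
        tokenizer_name_or_path, trust_remote_code=True
    )
    return lambda text: len(tokenizer.encode(text))


length_fn = get_length_function(None)  # ChineseRecursiveTextSplitter case
chunk_len = length_fn("some text")     # counts characters, no exception
```

The splitter would then be constructed with `length_function=length_fn`, and the `from_pretrained` call is only reached when a tokenizer path is actually configured.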
After installing openmp, running it again produces a different error:

```
Load parallel cpu kernel failed C:\Users\xuqin\.cache\huggingface\modules\transformers_modules\chatglm2-6b-int4\quantization_kernels_parallel.so:
Traceback (most recent call last):
  File "C:\Users\xuqin/.cache\huggingface\modules\transformers_modules\chatglm2-6b-int4\quantization.py", line 125, in __init__
    kernels = ctypes.cdll.LoadLibrary(kernel_file)
    ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "D:\Anaconda3\envs\llm_env\Lib\ctypes\__init__.py", line 454, in LoadLibrary
    return self._dlltype(name)...
```
Updated to v0.1.19, but got the same error: `openllm.exceptions.OpenLLMException: Model type is not supported yet.`