PromptExpert
> https://github.com/AUTOMATIC1111/stable-diffusion-webui/blob/0cc0ee1bcb4c24a8c9715f66cede06601bfc00c8/modules/extra_networks.py#L69 > > I found that the variable `extra_network_registry` is `{'hypernet': }`; lora is not registered after the API launches. And lora was loaded as an extension at this line: > > https://github.com/AUTOMATIC1111/stable-diffusion-webui/blob/0cc0ee1bcb4c24a8c9715f66cede06601bfc00c8/webui.py#L203 >...
I tried following [this guide](https://langchain.readthedocs.io/en/latest/modules/llms/integrations/huggingface_hub.html) and got this error: `Traceback (most recent call last): File "/data/wuhaixu/mygpt/ChatGLM-6B/tests/test_langchain.py", line 7, in llm_chain = LLMChain(prompt=prompt, llm=HuggingFaceHub(repo_id="THUDM/chatglm-6b", model_kwargs={"temperature":0, "max_length":64})) File "pydantic/main.py", line 341, in pydantic.main.BaseModel.__init__ pydantic.error_wrappers.ValidationError: 1 validation error for...
@imClumsyPanda > @NLPpupil I've now got this working by creating a CustomLLM class, and I'm planning to publish a repo later. Awesome! Could you share how you did it? Waiting online, it's rather urgent.
@imClumsyPanda Thanks a lot, I think I've figured it out.
Interested in this work; you have my support.
I encountered the same error. Updating to the latest modeling_baichuan.py still didn't work. Then I added model.enable_input_require_grads(), and it worked. Besides, my lora_target_modules only has 'o_proj'; maybe you can try...
@xxTree Hey, how did you implement Guandan? Please share.
Dependency parsing just doesn't seem fast enough. I've tried the dependency parsers in both HanLP and CoreNLP: HanLP takes 20-30 ms per sentence, CoreNLP 2-4 ms.
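For anyone reproducing per-sentence latency numbers like the HanLP vs. CoreNLP comparison above, a minimal timing harness might look like the sketch below. Here `dummy_parse` is a hypothetical stand-in; you would substitute the real parser call (e.g. a loaded HanLP or CoreNLP pipeline). The warmup runs are there because JIT-compiled or lazily initialized parsers are much slower on their first calls.

```python
import time

def bench_parser(parse, sentences, warmup=2, runs=10):
    """Return average per-sentence latency in milliseconds for a parse callable."""
    # Warm up: first calls often pay one-time initialization / JIT costs.
    for _ in range(warmup):
        for s in sentences:
            parse(s)
    start = time.perf_counter()
    for _ in range(runs):
        for s in sentences:
            parse(s)
    elapsed = time.perf_counter() - start
    return elapsed / (runs * len(sentences)) * 1000.0

# Hypothetical stand-in for a real dependency parser.
def dummy_parse(sentence):
    return sentence.split()

avg_ms = bench_parser(dummy_parse, ["The quick brown fox jumps."] * 5)
print(f"avg latency: {avg_ms:.3f} ms/sentence")
```

Measuring both parsers through the same harness, on the same sentences, keeps the comparison fair.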