ccly1996

23 comments by ccly1996

Hello, excuse me, could you share the details of your requirements? Thank you.

Hmm, I am deploying this code, but something went wrong when I installed the requirements. Can you help me? The author may have written something wrong in the requirements, so I need the requirements list. 😂

Thanks, I installed these requirements, but something still went wrong.

The error comes from torchvision, when I import torchvision.models. I think this error is caused by the torchvision version, but the author did not note which version is required.
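Since the comment points at a version mismatch, a quick first step is to print the installed torch/torchvision versions and confirm the import that fails. This is a minimal diagnostic sketch, not the project's own code; the package names are the standard PyPI ones.

```python
# Diagnostic sketch: report installed torch/torchvision versions, since a
# torch/torchvision version mismatch is a common cause of failures when
# importing torchvision.models.
from importlib.metadata import version, PackageNotFoundError


def installed_version(pkg: str) -> "str | None":
    """Return the installed version string of `pkg`, or None if absent."""
    try:
        return version(pkg)
    except PackageNotFoundError:
        return None


if __name__ == "__main__":
    for pkg in ("torch", "torchvision"):
        print(f"{pkg}: {installed_version(pkg)}")
```

If the two versions are not a published compatible pair, reinstalling them together (e.g. `pip install torch==<x> torchvision==<y>`) usually resolves the import error.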

> what's your LLM_MODEL in .env, and can you show your curl example to access your Qwen llm service? This is an example: ![image](https://github.com/eosphoros-ai/DB-GPT/assets/39228103/2701d859-5ec7-41da-8c3c-30b2fe563690) LLM_MODEL in .env is proxyllm.

> check your `PROXY_SERVER_URL` in .env, and you can debug what base_url in dbgpt/model/proxy/llms/chatgpt.py . I use the same PROXY_SERVER_URL in both .env and curl, but openai.APIConnectionError: Connection... still appears.
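To separate a bad `PROXY_SERVER_URL` from a code-level problem, it can help to probe the endpoint directly from Python. The sketch below assumes an OpenAI-compatible proxy exposing the usual `/v1/chat/completions` path (which may differ for your service); it only checks whether the server answers at all, which is what `openai.APIConnectionError` indicates is failing.

```python
# Connectivity probe sketch for an OpenAI-compatible proxy.
# Assumption: the proxy serves the standard /v1/chat/completions route;
# adjust the path if your service uses a different one.
from urllib.parse import urljoin
from urllib.request import Request, urlopen
from urllib.error import HTTPError, URLError


def chat_endpoint(base_url: str) -> str:
    """Join the proxy base URL with the OpenAI-style chat-completions path."""
    return urljoin(base_url.rstrip("/") + "/", "v1/chat/completions")


def server_answers(base_url: str, timeout: float = 5.0) -> bool:
    """True if the endpoint responds at all (even with an HTTP error status).

    A False result means a connection-level failure, which is what the
    openai client surfaces as APIConnectionError.
    """
    try:
        urlopen(Request(chat_endpoint(base_url)), timeout=timeout)
        return True
    except HTTPError:
        return True   # server reached; it just rejected this bare request
    except URLError:
        return False  # DNS/refused/timeout: the URL itself is unreachable
```

If `server_answers()` returns False for the exact value in `.env` while your curl command succeeds, the two are likely not hitting the same URL (check scheme, port, and trailing path segments).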

In our tests, the results from cogvlm2 are excellent.

> Hello, when executing "bash ./scripts/test_inference_quantized.sh 0 ./tests/test_prompt.txt", I get: File "/opt/conda/envs/python38/lib/python3.8/site-packages/codegeex-1.0-py3.8.egg/codegeex/kernels/__init__.py", line 15, in __init__ raise RuntimeError("File `%s` not found in `%s`" % (filename, RESOURCE_PACKAGE_NAME)) RuntimeError: File `quantization.fatbin` not found in `codegeex.kernels`. How should I handle this? I have the same problem. Have you solved it?

> > Glad it's solved, haha. I ran into this problem too, and reinstalling did not help either. ![image](https://github.com/open-mmlab/playground/assets/39228103/4b7ae55f-51fb-4902-a020-5708437fcb86)

Can codexgraph support locally deployed models?