yuege613

Results 13 comments of yuege613

Same problem with the local 13B model; it's much worse than their demo.

Same problem here, and the API's answers are very poor.

Indeed, mine has been hijacked twice already.

A freshly generated API key got hijacked not long after I deployed it, and it was being used to call GPT-4 nonstop.

Has this been solved? I'm running into the same issue; it used to work fine before.

root@58c8455c9d58:/home/model_hub# CUDA_VISIBLE_DEVICES=1,2 python3.9 -m fastchat.serve.cli --model-path Baichuan2-13B-Chat-V1 --num-gpus 2 Xformers is not installed correctly. If you want to use memory_efficient_attention to accelerate training use the following command to install Xformers...

> root@58c8455c9d58:/home/model_hub# CUDA_VISIBLE_DEVICES=1,2 python3.9 -m fastchat.serve.cli --model-path Baichuan2-13B-Chat-V1 --num-gpus 2 Xformers is not installed correctly. If you want to use memory_efficient_attention to accelerate training use the following command to install...

@SimFG

```
def initialize(self):
    try:
        connections.connect(**self.conn_config, timeout=self.client_timeout)  # timeout=3 [cannot set]
        if utility.has_collection(self.kb_name, timeout=self.client_timeout):
            self.sess = Collection(self.kb_name)
            logger.info(f'collection {self.kb_name} exists')
        else:
            schema = CollectionSchema(self.fields)
            logger.info(f'create collection {self.kb_name} {schema}')
            self.sess = ...
```

@SimFG

```
--> 182 return func(self, *args, **kwargs)

File ~/miniconda3/lib/python3.12/site-packages/pymilvus/decorators.py:122, in retry_on_rpc_failure.<locals>.wrapper.<locals>.handler(*args, **kwargs)
    120     back_off = min(back_off * back_off_multiplier, max_back_off)
    121 else:
--> 122     raise e from e
    123 except...
```
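For context, a minimal sketch of probing the Milvus connection up front, assuming the standard pymilvus client API; `MILVUS_HOST`, `MILVUS_PORT`, and `connect_or_fail` are hypothetical names for illustration, not code from this issue:

```
# Minimal sketch, not the issue's code: probe Milvus once with a short timeout
# so an unreachable server surfaces as a readable error instead of going
# through retry_on_rpc_failure's back-off.
from pymilvus import MilvusException, connections, utility

MILVUS_HOST = "127.0.0.1"  # assumption: standalone Milvus from the docker deployment
MILVUS_PORT = "19530"      # assumption: default Milvus port


def connect_or_fail(alias: str = "default", timeout: float = 3.0) -> None:
    """Connect and issue one cheap server call so failures show up immediately."""
    try:
        connections.connect(alias=alias, host=MILVUS_HOST, port=MILVUS_PORT)
        # list_collections actually hits the server, so a dead instance fails here.
        utility.list_collections(using=alias, timeout=timeout)
    except MilvusException as exc:
        raise RuntimeError(
            f"Milvus at {MILVUS_HOST}:{MILVUS_PORT} is unreachable: {exc}"
        ) from exc
```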

[milvus_part.log](https://github.com/milvus-io/milvus/files/15383643/milvus_part.log) @SimFG docker