FastGPT

Suspected bug when the QA model's context length is set above 32k

Open zenyanbo opened this issue 2 years ago • 1 comment

Routine checks

  • [x] I have confirmed that there is no similar existing issue
  • [x] I have read the project README and the project documentation in full
  • [x] I am using my own key and have confirmed that it works normally
  • [x] I understand and am willing to follow up on this issue, assist with testing, and provide feedback
  • [x] I understand and accept the above, and I understand that the maintainers have limited time; issues that do not follow the rules may be ignored or closed directly

Your version

  • [ ] Public cloud version
  • [x] Private deployment version

Problem description: When the QA model's context length is set above 32k, importing a file finishes instantly, but no chunks appear in the database. No error is shown.

Steps to reproduce: Change maxContext in config.json to 32000 and the problem occurs. Change it back to 16000 and the processing queue appears normally.
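For context, a minimal sketch of the kind of config.json entry being changed. The section and field names (qaModels, name, maxResponse) are assumptions that may differ between FastGPT versions; only maxContext and the 32000/16000 values come from this report:

```json
{
  "qaModels": [
    {
      "model": "gpt-4-1106-preview",
      "name": "gpt4-1106",
      "maxContext": 32000,
      "maxResponse": 8000
    }
  ]
}
```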

Expected result

Related screenshots

zenyanbo avatar Jan 22 '24 16:01 zenyanbo


Are you sure your model supports it?

c121914yu avatar Jan 23 '24 14:01 c121914yu

Yes, it is supported: mixtral 8x7b and gpt4-1106. However, other variables may also be affecting this, so I can't pin down exactly where the problem is. I'll look into it further when I have time.

zenyanbo avatar Jan 23 '24 15:01 zenyanbo
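To help isolate whether the limit comes from FastGPT's QA queue or from the model backend itself, one possible check (a hypothetical sketch, not taken from this thread; the endpoint URL, key, and model name below are placeholders) is to send a prompt longer than 16k tokens directly to the same OpenAI-compatible endpoint FastGPT is configured against:

```python
import requests

# Placeholders: point these at the same OpenAI-compatible endpoint FastGPT uses.
BASE_URL = "http://localhost:3001/v1"   # hypothetical endpoint
API_KEY = "sk-xxx"                      # hypothetical key
MODEL = "gpt-4-1106-preview"            # or the mixtral 8x7b deployment name

# Roughly 25k tokens of filler text, enough to exceed a 16k context window.
long_text = "test " * 25000

resp = requests.post(
    f"{BASE_URL}/chat/completions",
    headers={"Authorization": f"Bearer {API_KEY}"},
    json={
        "model": MODEL,
        "messages": [{"role": "user", "content": f"Summarize: {long_text}"}],
        "max_tokens": 50,
    },
    timeout=300,
)
# A 200 response means the backend accepts prompts above 16k tokens;
# a context-length error here would point at the model, not FastGPT.
print(resp.status_code, resp.text[:500])
```

If this request succeeds, the backend genuinely accepts a prompt above 16k tokens, which would suggest the problem lies in how FastGPT handles maxContext values above 16000.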
