
How do I connect to DeepSeek supported by Tencent Cloud?

Open Archer456 opened this issue 11 months ago • 7 comments

I want to connect to the deepseek-v3 model hosted by Tencent Cloud. I have referred to https://cloud.tencent.com/document/product/1772/115969 and https://docs.litellm.ai/docs/providers/openai_compatible

Image

How do I configure this model connection?

I can already connect directly to DeepSeek; I just have an insufficient balance.

Image

But how do I connect to the DeepSeek service hosted by Tencent Cloud? How do I configure the endpoint, model, and api_base correctly?

Archer456 avatar Feb 14 '25 06:02 Archer456

It seems LiteLLM (the library we currently use for LLM clients) does not support Tencent Cloud. We might consider adding support if there is enough interest.

Perhaps Ollama with a larger open-source model is the best option for you at the moment?

Chenglong-MS avatar Feb 14 '25 18:02 Chenglong-MS

It seems LiteLLM (the library we currently use for LLM clients) does not support Tencent Cloud. We might consider adding support if there is enough interest.

Perhaps Ollama with a larger open-source model is the best option for you at the moment?

Thank you for the response. Before February 14, that is, before I updated to data_formulator 0.1.5 (I had version 0.1.4 installed), I successfully connected to the DeepSeek service provided by Tencent Cloud, as shown in my installation notes from that time:

Image

So version 0.1.4 did support connecting to Tencent Cloud's DeepSeek service. It was only in version 0.1.5, which introduced a new connection configuration and brought in the LiteLLM project to manage AI service providers, that I could no longer figure out how to fill in the correct model, api_base, and endpoint.
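The 0.1.4-era setup can be sketched as follows (a minimal illustration, assuming the old client read the standard OPENAI_BASE_URL / OPENAI_API_KEY environment variables, which is the method mentioned later in this thread; the api_key value is a placeholder):

```python
import os

# Point the OpenAI-compatible client at Tencent Cloud's endpoint.
# The api_key value below is a placeholder, not a real credential.
os.environ["OPENAI_BASE_URL"] = "https://api.lkeap.cloud.tencent.com/v1"
os.environ["OPENAI_API_KEY"] = "<api_key issued by Tencent Cloud>"

# With these set, launching Data Formulator 0.1.4 would route its
# OpenAI-style requests to the Tencent Cloud endpoint instead of api.openai.com.
print(os.environ["OPENAI_BASE_URL"])
```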

Archer456 avatar Feb 17 '25 01:02 Archer456

Thanks! It looks like Tencent Cloud is compatible with the OpenAI API, but not with LiteLLM's API. I will see whether there is a good way to hack around this; I may open a separate branch to support it.

Chenglong-MS avatar Feb 18 '25 16:02 Chenglong-MS

Thanks! It looks like Tencent Cloud is compatible with the OpenAI API, but not with LiteLLM's API. I will see whether there is a good way to hack around this; I may open a separate branch to support it.

Oh, great, you speak Chinese, so I don't need to translate and worry about expressing the requirements unclearly with non-technical English terms.

Yes, in the previous version I could still connect to DeepSeek and to Tencent Cloud's DeepSeek by setting the environment variable OPENAI_BASE_URL="https://api.deepseek.com/v1" or OPENAI_BASE_URL="https://api.lkeap.cloud.tencent.com/v1", respectively. After upgrading to 0.1.5, i.e. with the new LiteLLM setup, DeepSeek itself still connects, but for Tencent Cloud's DeepSeek I went through https://docs.litellm.ai/docs/providers carefully and could not find a suitable compatibility option. I also tried filling in the following combinations at random:

model: deepseek/deepseek-chat, deepseek/deepseek-v3, openai/deepseek-v3, lkeap.cloud.tencent/deepseek-v3
api_base: https://api.lkeap.cloud.tencent.com/v1

All of them failed to connect. From the project's live logs, it looks like the request still goes to https://api.deepseek.com/beta/chat/completions. For example, with the following settings:

Image

the logs show:

Image

It seems to me that api_base: https://api.lkeap.cloud.tencent.com/v1 is not taking effect. As I understand it, requests should go to that address rather than continuing to hit api.deepseek.com.
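The behavior seen in the logs can be illustrated with a small sketch. This is not LiteLLM's actual code, just a hypothetical model of the routing observed in the thread: the provider prefix in the model string appears to decide which base URL is used, which would explain why api_base seemed to be ignored for deepseek/ models:

```python
from typing import Optional

def resolve_base_url(model: str, api_base: Optional[str]) -> str:
    """Toy model of provider-prefix routing (hypothetical, for illustration)."""
    provider, _, _name = model.partition("/")
    if provider == "deepseek":
        # Provider-specific client with a fixed default endpoint.
        return "https://api.deepseek.com"
    if provider == "openai" and api_base:
        # The generic OpenAI-compatible client honors the user-supplied api_base.
        return api_base
    return api_base or "https://api.openai.com/v1"

tencent = "https://api.lkeap.cloud.tencent.com/v1"
print(resolve_base_url("deepseek/deepseek-v3", tencent))  # https://api.deepseek.com
print(resolve_base_url("openai/deepseek-v3", tencent))    # the Tencent endpoint
```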

Archer456 avatar Feb 19 '25 01:02 Archer456

Hello! We just updated the OpenAI API method on the dev branch; it works the same way as the 0.1.4 API. You can try provider=openai and then enter api_base, api_key, and model to connect.

The dev branch has not been published to PyPI yet, so you will need to build it locally: https://github.com/microsoft/data-formulator/blob/dev/DEVELOPMENT.md

Chenglong-MS avatar Feb 20 '25 00:02 Chenglong-MS

This fix has been merged into a new release; please give it a try: https://github.com/microsoft/data-formulator/releases/tag/0.1.6

Chenglong-MS avatar Feb 20 '25 23:02 Chenglong-MS

This fix has been merged into a new release; please give it a try: https://github.com/microsoft/data-formulator/releases/tag/0.1.6

Thank you very much. This is fixed in 0.1.6; I can now connect to DeepSeek hosted by providers like Tencent Cloud.

Connection settings:

endpoint: openai

api_key: the api_key issued by Tencent Cloud

model: deepseek-r1 or deepseek-v3

api_base: https://api.lkeap.cloud.tencent.com/v1
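The working configuration above amounts to sending OpenAI-style chat requests to the Tencent Cloud base URL. Here is a stdlib-only sketch of such a request; the build_request helper is hypothetical (not part of Data Formulator or LiteLLM), and the request is only constructed, not sent, since sending it would need a real api_key and network access:

```python
import json
import urllib.request

def build_request(api_base: str, api_key: str, model: str, prompt: str) -> urllib.request.Request:
    """Build an OpenAI-style chat completion request (hypothetical helper)."""
    body = json.dumps({
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }).encode("utf-8")
    return urllib.request.Request(
        url=f"{api_base}/chat/completions",
        data=body,
        headers={
            "Authorization": f"Bearer {api_key}",
            "Content-Type": "application/json",
        },
        method="POST",
    )

req = build_request(
    "https://api.lkeap.cloud.tencent.com/v1", "<api_key>", "deepseek-v3", "hello"
)
print(req.full_url)  # https://api.lkeap.cloud.tencent.com/v1/chat/completions
```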

Image

This issue can be closed.

Archer456 avatar Feb 21 '25 01:02 Archer456