Integration: Support Deepseek as New AI Backend in Karpor
What would you like to be added?
Add support for Deepseek integration in Karpor by adding `deepseek` as a new backend option in the AI-related parameters. Allow minimal configuration by omitting `baseUrl` and `model` (defaulting to `deepseek-chat`).
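The proposed defaulting can be sketched in shell (variable names and the Deepseek endpoint URL are assumptions for illustration; the real logic would live in the Karpor server config handling):

```shell
# Hypothetical illustration of the proposed defaults: when the backend is
# deepseek and no model/baseUrl is given, fall back to deepseek-chat and an
# assumed Deepseek API endpoint.
backend="deepseek"
model="${model:-deepseek-chat}"            # default when --set server.ai.model is omitted
base_url="${base_url:-https://api.deepseek.com}"  # assumed default endpoint
echo "$backend $model $base_url"
```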
Minimal usage:

```shell
helm install karpor-release kusionstack/karpor \
  --set server.ai.authToken=YOUR_AI_TOKEN \
  --set server.ai.backend=deepseek
```
Full usage:

```shell
helm install karpor-release kusionstack/karpor \
  --set server.ai.authToken=YOUR_AI_TOKEN \
  --set server.ai.model=deepseek-reasoner \
  --set server.ai.backend=deepseek
```
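Equivalently, the same settings could be kept in a values file and passed with `-f` (the key layout below is an assumption mirroring the `--set` paths above):

```yaml
# values.yaml (hypothetical layout mirroring the --set paths above)
server:
  ai:
    authToken: YOUR_AI_TOKEN
    backend: deepseek
    model: deepseek-reasoner  # optional; omit to default to deepseek-chat
```

Then install with `helm install karpor-release kusionstack/karpor -f values.yaml`.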
Usage for other backends: https://www.kusionstack.io/karpor/getting-started/installation#enable-ai-features
After implementing this, we can apply to join deepseek-ai/awesome-deepseek-integration.
Why is this needed?
This feature will enable users to quickly use Deepseek within Karpor, enhancing its AI capabilities and user experience. It simplifies configuration while maintaining flexibility for advanced setups.
I will try my best to do this task.
@Cookiery Very welcome! After this task is completed, we will publish articles to promote it and credit you as the feature developer. Hope you don't mind!
Is it assigned to @Cookiery?
Hi, thank you for your attention. You can see more implementation details in this PR: https://github.com/KusionStack/karpor/pull/797. By the way, we welcome technology enthusiasts to contribute to the project.
@brian-villa Yeah, @Cookiery has finished it (in the PR @fanfan-yu mentioned), and the issue will be closed after the documentation is updated. We also plan to support more AI backends to make it easier to use LLMs. If you have any suggestions, you are welcome to submit an issue!