
Custom Anthropic baseURL doesn't return the code output result

Open pzc163 opened this issue 1 year ago • 3 comments

In models.ts I changed the Anthropic and OpenAI baseURL as shown below:

```ts
export function getModelClient(model: LLMModel, config: LLMModelConfig) {
  const { id: modelNameString, providerId } = model
  const { apiKey, baseURL } = config

  const providerConfigs = {
    anthropic: () =>
      createOpenAI({
        apiKey: apiKey || process.env.ANTHROPIC_API_KEY,
        baseURL: 'https://api.xhub.chat/v1',
      })(modelNameString),
    openai: () =>
      createOpenAI({
        apiKey: apiKey || process.env.OPENAI_API_KEY,
        baseURL: 'https://api.xhub.chat/v1',
      })(modelNameString),
```

The application runs and the LLM response is visible, but it never returns the result of running the code, and the preview doesn't load. The log shows:

```
model {
  id: 'claude-3-5-sonnet-20240620',
  provider: 'Anthropic',
  providerId: 'anthropic',
  name: 'Claude 3.5 Sonnet',
  multiModal: true
}
config { model: 'claude-3-5-sonnet-20240620' }
POST /api/chat 200 in 31541ms
```

I don't have a default Anthropic API key, so what can I do to solve this problem?

pzc163 avatar Sep 06 '24 16:09 pzc163

Have you set your E2B_API_KEY as an environment variable?

mishushakov avatar Sep 06 '24 16:09 mishushakov

> Have you set your E2B_API_KEY in environment variable?

Yes, E2B_API_KEY has already been set.

pzc163 avatar Sep 07 '24 03:09 pzc163

I was unable to reproduce this issue.

mishushakov avatar Feb 19 '25 15:02 mishushakov