
How can I convert an OpenAI endpoint to a local VLLM endpoint?

Open · ShikangPang opened this issue 1 year ago · 3 comments

I deployed a vLLM model locally and tried to connect to it with the Void extension, but I can't initiate a conversation. What should I do?

ShikangPang · Sep 25 '24
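Before looking at the extension, it helps to confirm that the local server itself answers OpenAI-style requests. Here is a minimal sketch using the official openai Python client, assuming vLLM's OpenAI-compatible server is running on its default http://localhost:8000 and was started without --api-key (so any placeholder key is accepted):

```python
from openai import OpenAI

# Assumption: vLLM's OpenAI-compatible server on the default port,
# started without an API key, so "EMPTY" works as a placeholder.
client = OpenAI(base_url="http://localhost:8000/v1", api_key="EMPTY")

# List the model IDs the server exposes; the model name configured in
# the extension must match one of these exactly.
for model in client.models.list():
    print(model.id)
```

If this check fails, the problem lies with the vLLM deployment rather than with the extension.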

We need more details. Can you share a few screenshots of what you tried to do?

andrewpareles · Sep 30 '24

Would it be possible to add an OpenAI-compatible API option, something like an OpenAI_base_url setting, so that endpoints such as https://xxxx.com/v1 can be used?

cppcloud · Oct 02 '24
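For context, a custom base URL is exactly what such a setting would pass through to the client: the official openai client already accepts a base_url override, so a sketch like the one below works against any server that speaks the OpenAI API. The https://xxxx.com/v1 endpoint is the placeholder from the comment above, and my-local-model is a hypothetical model ID:

```python
from openai import OpenAI

# Placeholder endpoint from the comment above; substitute your own server.
client = OpenAI(base_url="https://xxxx.com/v1", api_key="EMPTY")

response = client.chat.completions.create(
    model="my-local-model",  # hypothetical: must match a model ID the server serves
    messages=[{"role": "user", "content": "Hello!"}],
)
print(response.choices[0].message.content)
```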

(screenshot) Just like this: I am trying to connect to a local model by modifying the endpoint URL and the model name.

ShikangPang · Oct 06 '24

@ShikangPang let us know if it's working now! Can you use the openaiCompatible option?

andrewpareles · Oct 23 '24

I can use the openaiCompatible option normally.

ShikangPang · Oct 26 '24