How can I switch an OpenAI endpoint to a local vLLM endpoint?
I deployed a model locally with vLLM and tried to connect using the Void extension, but I can't start a conversation. What should I do?
We need more details - can you share a few screenshots of what you tried?
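In the meantime, it's worth confirming that the local vLLM server itself answers OpenAI-compatible requests, independent of the extension. A rough check with the openai Python client, assuming vLLM's default port 8000 and a placeholder model name:

```python
from openai import OpenAI

# Point the client at the local vLLM server instead of api.openai.com.
# vLLM's OpenAI-compatible server listens on port 8000 by default; the
# api_key can be any non-empty string unless vLLM was started with --api-key.
client = OpenAI(base_url="http://localhost:8000/v1", api_key="EMPTY")

# The server should list the model it was launched with.
for model in client.models.list():
    print(model.id)

# A one-off chat completion; replace the model name with the one you serve.
response = client.chat.completions.create(
    model="Qwen/Qwen2.5-7B-Instruct",  # placeholder model name
    messages=[{"role": "user", "content": "Hello"}],
)
print(response.choices[0].message.content)
```

If this fails, the problem is on the server side rather than in the extension settings.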
Would it be possible to add an OpenAI-compatible API option, something like an OpenAI_base_url setting, so that a URL like https://xxxx.com/v1 can be used?
Something like this: I'm trying to connect to a local model by changing the endpoint URL and the model name.
@ShikangPang let us know if it's working now! Can you use the openaiCompatible option?
I can use the openaiCompatible option normally now.
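For anyone else who lands here: with an OpenAI-compatible provider, only the base URL and model name change, and streaming (which chat UIs typically rely on) follows the same pattern. A minimal sketch against a local vLLM server; the port and model name are assumptions, substitute your own:

```python
from openai import OpenAI

# Same idea as the openaiCompatible setup: only the base URL and model change,
# the request shape stays standard OpenAI chat completions.
client = OpenAI(base_url="http://localhost:8000/v1", api_key="EMPTY")

# Streaming request, printing tokens as they arrive.
stream = client.chat.completions.create(
    model="Qwen/Qwen2.5-7B-Instruct",  # placeholder; use the model vLLM serves
    messages=[{"role": "user", "content": "Say hello in one sentence."}],
    stream=True,
)
for chunk in stream:
    # Some chunks (e.g. the final one) may carry no content delta.
    if chunk.choices and chunk.choices[0].delta.content:
        print(chunk.choices[0].delta.content, end="", flush=True)
print()
```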