Fetch available models from ChatGPT before asking a question
Closes #6
Fetches the available model names using the ChatGPT API.
This ensures that only models actually available to the user are used. For ChatGPT Plus subscribers, the paid model is selected, which improves availability.
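A minimal sketch of what this looks like, assuming the OpenAI-style `GET /v1/models` endpoint with a bearer token; the helper names (`parse_model_names`, `pick_model`) and the preference order are hypothetical, not the PR's actual code:

```python
import json
import urllib.request

def fetch_models_payload(access_token: str,
                         base_url: str = "https://api.openai.com") -> dict:
    """Fetch the raw /v1/models response (assumes an OpenAI-style API)."""
    req = urllib.request.Request(
        f"{base_url}/v1/models",
        headers={"Authorization": f"Bearer {access_token}"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)

def parse_model_names(payload: dict) -> list:
    """Extract model IDs from the /v1/models response body."""
    return [model["id"] for model in payload.get("data", [])]

def pick_model(available: list,
               preferred: tuple = ("gpt-4", "gpt-3.5-turbo")) -> str:
    """Prefer the paid model when the account has access to it."""
    for name in preferred:
        if name in available:
            return name
    return available[0]  # fall back to whatever is available
```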
Thanks for opening this. Looks good to me, except could you store the model name in the browser cache instead of in a local variable? That way we don't have to run this function every time the user starts a new session.
@shobrook Yeah, that sounds like a good idea! I've already pushed the changes.
I added it to the same cache as the access token. This way, if OpenAI releases a new model (like happened today), it will be picked up once the cache expires.
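The expiring-cache behavior described above could be sketched like this; the `TTLCache` class is a hypothetical stand-in for whatever cache the extension actually uses for the access token:

```python
import time

class TTLCache:
    """A tiny cache whose entries expire after a fixed time-to-live."""

    def __init__(self, ttl_seconds: float):
        self.ttl = ttl_seconds
        self._store = {}

    def get(self, key):
        """Return the cached value, or None if missing or expired."""
        entry = self._store.get(key)
        if entry is None:
            return None
        value, expires_at = entry
        if time.time() >= expires_at:
            del self._store[key]  # stale: force a re-fetch of models
            return None
        return value

    def set(self, key, value):
        self._store[key] = (value, time.time() + self.ttl)
```

When the cached model name expires, the next session re-runs the model fetch, so newly released models are picked up automatically.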
Hi @shobrook,
Any more changes needed?