It is not possible to use LM Studio
LM Studio is fully compatible with the OpenAI API, except that it does not require any API key. Unless I'm doing something wrong, it is impossible to use it with either the Ollama or the OpenAI format: if I select OpenAI, the client calls the default endpoint instead of the configured one, so the key is rejected as invalid.
Hmm! Let me take a look tomorrow to see what's going on; it might be related to the other issue with OpenRouter (https://github.com/microsoft/data-formulator/issues/143).
I think part of the problem is that if you choose OpenAI as the provider, it does not take the configured endpoint into account and simply assumes the official OpenAI API URL. Something similar happens if you select Ollama: the request at least reaches LM Studio, but the endpoint path is not the one LM Studio expects, so it does not respond properly.
Note: In my case, LM Studio serves the model on the local network (yes, it is accessible, confirmed :)).
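For reference, here is a minimal sketch of how an OpenAI-compatible client can be pointed at a local LM Studio server instead of the default api.openai.com endpoint. The host and port (http://localhost:1234/v1) are LM Studio's defaults, and the key value is an arbitrary placeholder since LM Studio ignores it; adjust the host if the server runs elsewhere on your network. If base_url is not overridden, the client falls back to api.openai.com, where the placeholder key is rejected as invalid, which matches the symptom above.

```python
from openai import OpenAI

# LM Studio exposes an OpenAI-compatible server, by default on port 1234.
# It ignores the API key, but the client library requires a non-empty value.
client = OpenAI(
    base_url="http://localhost:1234/v1",  # must override the default endpoint
    api_key="lm-studio",                  # placeholder; any non-empty string works
)

# Ask the server which models are loaded, then send a test chat request.
models = client.models.list()
model_id = models.data[0].id

response = client.chat.completions.create(
    model=model_id,
    messages=[{"role": "user", "content": "Say hello in one word."}],
)
print(response.choices[0].message.content)
```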
Btw, I'm wondering whether you set that in the .env file (when running from source) or provide it from the website?
I'm trying to use it in the most standard way: pip-installing the package, then configuring it from the website.
Yes, I figured out the bugs there, just fixed them, will push in a minute :)
Should be fixed now: if you run pip install --upgrade data_formulator, it will update to 0.2.1.1, which can use the OpenAI endpoint format with other clients (e.g., Ollama) without problems.
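After upgrading, a quick way to confirm that the configured endpoint is actually being honored is to list the models the local server is serving. This is just an illustrative check with the openai Python package; the host and port are assumptions based on LM Studio's defaults.

```python
from openai import OpenAI

# Sanity check after the 0.2.1.1 upgrade: if the configured base_url is
# honored, this prints the models LM Studio is serving instead of failing
# with an "invalid API key" error from api.openai.com.
client = OpenAI(base_url="http://localhost:1234/v1", api_key="not-needed")
for model in client.models.list().data:
    print(model.id)
```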
It now works perfectly with LM Studio; the first attempts with local models are working very well. Very interesting!