Support for local models
Hi! Your plugin looks great and I'd like to use it, but does it support locally running models? If yes, how can I use them?
Thanks for your feedback. This isn't currently supported, but I can add the feature if needed. Would configuring the URL of the local model in the plugin satisfy this requirement?
If not, let me know how you'd like to use it and I'll be happy to add support.
I think it would satisfy the need if a locally running OpenAI-compatible server could be used by specifying its address.
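For context, here is a minimal sketch of what "OpenAI-compatible" means in practice: the client only needs a configurable base URL and the standard `/chat/completions` request shape. The URL, port, and model name below are assumptions for illustration (e.g. a llama.cpp or Ollama server), not something the plugin defines.

```python
import json

# Hypothetical local server address; a real setup would make this configurable.
base_url = "http://localhost:8080/v1"

# Standard OpenAI-style chat completion payload; local servers accept the
# same shape, and most ignore the API key entirely.
payload = {
    "model": "local-model",  # placeholder model name, an assumption
    "messages": [{"role": "user", "content": "Hello"}],
}

# The plugin would POST this JSON to f"{base_url}/chat/completions".
endpoint = f"{base_url}/chat/completions"
body = json.dumps(payload)
```

So supporting local models mostly reduces to letting the user override the base URL that the existing OpenAI client code already targets.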
:tada: This issue has been resolved in version 1.4.0 :tada:
The release is available on the GitHub release page.
Your semantic-release bot :package::rocket: