Support for Ollama servers?
Hi all,
I continue to be astonished at the speed and quality of development here; thanks to everyone involved for something which has made life better and easier. I thought I would ask, if I may, for a feature which might become more useful as time passes. Ollama (https://ollama.com/) is a tool for running LLMs locally or on the local intranet. There is an accessible client for it, though at an early stage, at https://github.com/chigkim/VOLlama/
Would it be possible for the add-on to support sending data to an Ollama server/model? It would be particularly nice to be able to send NVDA's objects to the model as well as, of course, to work with local models through the central dialogue. Thanks for looking into whether this would be possible.
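For what it's worth, Ollama exposes an OpenAI-compatible HTTP endpoint on the local machine (by default at http://localhost:11434/v1), so pointing the existing client at it might be a fairly small change. Here is a minimal sketch of what such a request could look like, assuming a model such as llama3 has already been pulled locally (the model name and prompt are just illustrative):

```python
from openai import OpenAI

# Ollama serves an OpenAI-compatible API on port 11434 by default.
# The api_key value is ignored by Ollama but the client library requires one.
client = OpenAI(base_url="http://localhost:11434/v1", api_key="ollama")

response = client.chat.completions.create(
    model="llama3",  # example only: any model already pulled with `ollama pull`
    messages=[
        {"role": "user", "content": "Describe the focused control to a screen reader user."}
    ],
)
print(response.choices[0].message.content)
```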
Hello, yes, it is in progress in #62. The next release should include this :) Thanks
Thanks very much; I'm sorry I didn't check that before. I'm not as familiar as I ought to be with GitHub. As always, I really appreciate your speed.