
[FEATURE] support for lm studio

Open pechaut78 opened this issue 2 years ago • 11 comments

LM Studio is super easy to set up, and simpler than LocalAI.

It mimics the OpenAI API. LangChain supports it by passing a local base URL.

It would be wonderful to do the same thing with Flowise.

pechaut78 avatar Nov 24 '23 02:11 pechaut78

I'm with you and frankly a little puzzled why this isn't already supported.

jeffthorness avatar Jan 22 '24 20:01 jeffthorness

@dev: just load the OpenAI LLM like this (Python):

from langchain_openai import ChatOpenAI

# LM Studio ignores the key, but the client requires one to be set
llm = ChatOpenAI(base_url="http://localhost:1234/v1", api_key="not-needed")

SphaeroX avatar Jan 31 '24 11:01 SphaeroX
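Since LM Studio exposes an OpenAI-compatible server, any client that lets you override the base URL works; here is a minimal stdlib sketch of what such a request looks like (the model name and `build_chat_request` helper are illustrative assumptions, not part of any library):

```python
import json
import urllib.request

def build_chat_request(base_url: str, model: str, messages: list) -> urllib.request.Request:
    """Build a POST request for an OpenAI-compatible /chat/completions
    endpoint, the API surface LM Studio serves locally."""
    url = base_url.rstrip("/") + "/chat/completions"
    body = json.dumps({"model": model, "messages": messages}).encode()
    return urllib.request.Request(
        url,
        data=body,
        headers={
            "Content-Type": "application/json",
            # LM Studio does not validate the key, but OpenAI-style
            # clients expect the Authorization header to be present.
            "Authorization": "Bearer none",
        },
    )

req = build_chat_request(
    "http://localhost:1234/v1",
    "local-model",  # placeholder; a local server typically ignores it
    [{"role": "user", "content": "Hello"}],
)
print(req.full_url)  # http://localhost:1234/v1/chat/completions
```

Sending it with `urllib.request.urlopen(req)` requires LM Studio's local server to actually be running on port 1234.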

[screenshot of the node configuration] Works perfectly fine like this for me. I am using LM Studio with the API key set to "none" in the credentials. Note that the address for you will probably just be localhost:1234, but since I am running the Docker container, I needed to use the IP address of the vEthernet adapter to reach LM Studio.

KennyVaneetvelde avatar Feb 02 '24 14:02 KennyVaneetvelde

Excellent !!

Thanks a lot


pechaut78 avatar Feb 02 '24 15:02 pechaut78

Is there a website where they share workflows?

shiloh92 avatar Feb 08 '24 07:02 shiloh92

@KennyVaneetvelde I have tried this but it didn't work. I actually have a Flowise instance on DigitalOcean but LM Studio on my local laptop. Do I have to have both locally for this to work? I would love a DigitalOcean deployment solution! Thanks in advance.

ArkMaster123 avatar Mar 14 '24 22:03 ArkMaster123

In Docker, there are several ways to let an application inside a container reach the host machine's localhost. A common one:

Use the special DNS name: Docker provides containers with a special DNS name, host.docker.internal, which points to the host's IP address. You can use this name inside the container to reach services running on the host.

So I replaced http://localhost:1234/v1 with http://host.docker.internal:1234/v1

It works!

new4u avatar Mar 24 '24 05:03 new4u
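The choice above can be reduced to a one-line rule: use localhost when Flowise runs directly on the host, and host.docker.internal when it runs in a container. A small sketch (the `lmstudio_base_url` helper is hypothetical; 1234 is LM Studio's default port):

```python
def lmstudio_base_url(flowise_in_docker: bool, port: int = 1234) -> str:
    """Pick the base URL for reaching LM Studio on the host machine.

    host.docker.internal resolves to the host from inside a container;
    it is built in on Docker Desktop, while on plain Linux it requires
    running the container with --add-host=host.docker.internal:host-gateway.
    """
    host = "host.docker.internal" if flowise_in_docker else "localhost"
    return f"http://{host}:{port}/v1"

print(lmstudio_base_url(True))   # http://host.docker.internal:1234/v1
print(lmstudio_base_url(False))  # http://localhost:1234/v1
```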

@KennyVaneetvelde Hi, sounds great! Where did you get the Docker image? I wasn't able to find one on Docker Hub, and I wouldn't know how to build it. Thank you!!

RicardoFernandez-UY avatar Apr 05 '24 12:04 RicardoFernandez-UY

> In Docker, there are several ways to let an application inside a container reach the host machine's localhost. A common one:
>
> Use the special DNS name: Docker provides containers with a special DNS name, host.docker.internal, which points to the host's IP address. You can use this name inside the container to reach services running on the host.
>
> So I replaced http://localhost:1234/v1 with http://host.docker.internal:1234/v1
>
> It works!

I tried this but unfortunately it didn't work. What did work like a charm was changing the localhost URL to http://172.17.0.1:1234/v1! Note that I'm running Flowise in Docker and LM Studio locally.

rachdeg avatar May 03 '24 22:05 rachdeg
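As the thread shows, the right host differs by setup: localhost, host.docker.internal (Docker Desktop), or 172.17.0.1 (the default docker0 bridge gateway on Linux). A small probe can find the first candidate that answers on the LM Studio port; this is a sketch, and `find_lmstudio_base_url` is a hypothetical helper, not a Flowise or LM Studio API:

```python
import socket
from typing import Optional

# Candidate hosts, in the order they came up in this thread.
CANDIDATE_HOSTS = ("localhost", "host.docker.internal", "172.17.0.1")

def find_lmstudio_base_url(port: int = 1234, timeout: float = 0.5) -> Optional[str]:
    """Return the first candidate base URL whose host accepts a TCP
    connection on the given port, or None if none respond."""
    for host in CANDIDATE_HOSTS:
        try:
            # create_connection raises OSError on refusal, DNS
            # failure, or timeout, so unreachable hosts are skipped.
            with socket.create_connection((host, port), timeout=timeout):
                return f"http://{host}:{port}/v1"
        except OSError:
            continue
    return None

print(find_lmstudio_base_url())  # depends on your environment
```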

Hi, I tried the approach KennyVaneetvelde described on Feb 2, but the chat window does not stream the message: the LLM's reply appears all at once, although in LM Studio I can see the response building up. What could be wrong in my Flowise diagram? [screenshot of the Flowise flow]

jan-wijman avatar Jul 18 '24 11:07 jan-wijman