Davide Guidotti
@Boostrix I really like the idea, and my PR pretty much already covers it.
> Would this also handle the current issue where the GPT4 default fails for people who only have access to GPT3.5 (e.g. #4229)? Or more generally: how does this...
Recently PyTorch changed its install command: it now uses --index-url instead of --extra-index-url, as mentioned in #9483. Also, I noticed your code doesn't cover AMD cards (in that case TORCH_COMMAND...
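For reference, a sketch of what the updated TORCH_COMMAND variants could look like with the new flag. The specific wheel-index paths and versions below are assumptions based on PyTorch's published index layout, not values taken from the PR:

```shell
# Sketch only: TORCH_COMMAND variants using the newer --index-url flag.
# Versions and index paths are assumptions, not taken from the PR.

# NVIDIA (CUDA 11.8 wheel index):
TORCH_COMMAND="pip install torch torchvision --index-url https://download.pytorch.org/whl/cu118"

# AMD (ROCm wheel index):
TORCH_COMMAND="pip install torch torchvision --index-url https://download.pytorch.org/whl/rocm5.4.2"

# Print the command that would be run (rather than actually installing here)
echo "$TORCH_COMMAND"
```

The key difference: --index-url replaces PyPI entirely with the PyTorch index, whereas the old --extra-index-url merely added it alongside PyPI.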
> rocm 5.6.0 alpha is out and it brings torch 2.0 compatibility, i'd be curious if that works.

Really? Where? I see 5.4.3 as the last release on https://github.com/RadeonOpenCompute/ROCm/releases
> > rocm 5.6.0 alpha is out and it brings torch 2.0 compatibility, i'd be curious if that works.
>
> Really? Where? I see...
Agreed. I also made a PR for setting LLM_BASE_URL with make setup-config; it should be enough for any OpenAI-compatible API: https://github.com/OpenDevin/OpenDevin/pull/616 For example, it works with Koboldcpp by setting it like this:...
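As an illustration of the shape such a setting takes (the variable name comes from PR #616; the URL below is a hypothetical local Koboldcpp endpoint on its default port, not the exact value from the truncated comment):

```shell
# Hypothetical sketch: pointing at a local OpenAI-compatible server.
# LLM_BASE_URL is the variable introduced in PR #616; the value is an example.
LLM_BASE_URL="http://localhost:5001/v1"
```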
> Does anything else than local requires a base url? I think LiteLLM forwards the others according to the model.
>
> If so, it might be worth to say...
> That's great, is there any hint to tell the user what form BASE_URL should take?

Frankly, I was quite confused at first as to whether it should be https://api.openai.com/v1/...
I have the same problem on a 5700XT using rocm 5.4.2 and pytorch 2.0. Strangely, it works fine using pytorch 1.13.1. Same issue with both --medvram and --lowvram. With pytorch 2...
> So cool.

Auto-GPT just works with a local model on text-generation-webui out of the box:
- Run the matatonic/text-generation-webui server with --openai
- Run Gdev91/Auto-GPT with OPENAI_API_BASE_URL=http://127.0.0.1:5001/ in .env

I used EMBED_DIM=5120 in...
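Putting the values from the comment above together, the .env fragment would look roughly like this (OPENAI_API_BASE_URL and EMBED_DIM are taken from the comment; the comments are my own reading of them):

```shell
# .env sketch for Gdev91/Auto-GPT against a local text-generation-webui --openai server.
# Both values are from the comment above.
OPENAI_API_BASE_URL=http://127.0.0.1:5001/
EMBED_DIM=5120  # embedding dimension matching the local model's hidden size
```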