lelapin123

Results 13 comments of lelapin123

Alpaca Electron is very decent with 32 GB of RAM (not VRAM). Congrats on your A4000 at $10000/unit.

It seems related to the model.

So what settings (and databases) should you use to ask questions about a text? Edit: I notice a "summary" mode, btw.

The answers are pretty accurate (on one file that is a bit tricky to understand), but it is slower than privategpt.

> Are you running it on a GPU?
On Windows: yes, I can see it. Unless it loads into the GPU but actually runs on the CPU.
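A quick way to check this (assuming the app runs on PyTorch under the hood, which is my assumption here):

python -c "import torch; print(torch.cuda.is_available())"

and keep nvidia-smi open in another terminal while a prompt is running: if GPU memory and utilization stay flat, the work is happening on the CPU.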

Hardware is a basic 3090, CUDA is 11.7. Look, I watched your video: do you really need the Visual Studio environment to make this work? I notice that the loading...

Hugging Face stores its models here: C:\Users\username\.cache\huggingface\hub, and my C drive is an SSD too.
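If you want to see exactly what is sitting in that cache, huggingface_hub ships a CLI for it (assuming huggingface_hub is installed, which it normally is alongside the Hugging Face libraries):

huggingface-cli scan-cache

It lists the cached repos with their sizes, so you can confirm the models really landed on the SSD.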

I fixed the issue with: pip install torch torchvision torchaudio --index-url https://download.pytorch.org/whl/cu117 (source: https://pytorch.org/get-started/locally/), but it gave me an error later: ERROR: pip's dependency resolver does not currently take...

You need to add force-reinstall as an option: --force-reinstall. See the full command below.
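Putting the two comments together, the working command would be (same packages and index URL as above, just with the flag added):

pip install --force-reinstall torch torchvision torchaudio --index-url https://download.pytorch.org/whl/cu117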