Jan Bours
> It ran, I believe, but the model is just very dumb. Couldn't we use a better language model? There are a lot of them around these days, their...
> Absolutely, you can plug and play! Just input your own model and tokenizer! I'm here for you if you seek me I will do that! But at...
> I think it would also be beneficial to have a separate category of tools that work with a local LLM, and another category for tools that require API keys. >...
> Hi, thanks for the Ollama integration for local models. I was wondering if it's possible to use local models in the Colab environment? The reason being not everyone...
> Hey @jbdatascience I think it's OK to close this? I think you could do it through Docker Spaces, but I'm not sure if that's the best way to do...
I installed it like this:
- https://github.com/LOVE-DOCTOR/MultiTrain#installation
- `!pip install MultiTrain`
- Successfully installed MultiTrain-0.1.30 catboost-1.1 pyaml-21.10.1 scikit-optimize-0.9.0

And after that I did:
- If you experience issues or come across...
> Yes > > Just create single pinpongs (list) Then share it with two different prompt objects Interesting! Could you show a piece of Python code to do just that...
Thank you. I will try this out and see where it takes me!
I have no particular use-case in mind, but I will search for one that hopefully makes it possible to compare against traditional FA algorithms. The criterion for deciding which...
Interesting work! I would like to try your code out on my Windows PC, but I have no experience with Docker, which is necessary if I understood you correctly. Could...