Jamie Coombes
Note this would also be possible by defining a callback function that splits the text into sections by role using a regex and color-codes each section.
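A minimal sketch of that callback idea, assuming role markers like `user:` / `assistant:` at the start of a line and ANSI escape codes for coloring (the marker names and color choices here are illustrative, not from the original):

```python
import re

# Hypothetical role-to-color mapping using ANSI escape codes.
ROLE_COLORS = {
    "system": "\033[33m",     # yellow
    "user": "\033[36m",       # cyan
    "assistant": "\033[32m",  # green
}
RESET = "\033[0m"

def colorize_by_role(text: str) -> str:
    # Split on line-leading role markers, keeping each marker
    # paired with the section body that follows it.
    parts = re.split(r"(?m)^(system|user|assistant):", text)
    # re.split yields [prefix, role1, body1, role2, body2, ...]
    out = []
    for role, body in zip(parts[1::2], parts[2::2]):
        color = ROLE_COLORS.get(role, "")
        out.append(f"{color}{role}:{body}{RESET}")
    return "".join(out)
```

For example, `colorize_by_role("user: hi\nassistant: hello")` wraps the `user:` section in cyan and the `assistant:` section in green.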
Here's the new code...

```mojo
from python import Python

fn main() raises:
    let pt = Python.import_module("torch")
    # Print CUDA details
    print("CUDA available:", pt.cuda.is_available())
    print("CUDA device count:", pt.cuda.device_count())
    # ...
```
It's a stopgap, but I've naively updated chromadb and embedchain, and Memory seems to work with the ollama provider now. I'm currently taking a look at making the...
As a stopgap, can this be fixed by altering the stop sequence expected in the Ollama Modelfile?
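For reference, a Modelfile sets stop sequences with the `PARAMETER stop` directive; something along these lines (the base model and stop token below are placeholders, not the actual values this issue needs):

```
FROM llama2
PARAMETER stop "<|im_end|>"
```

Multiple `PARAMETER stop` lines can be given to register several stop sequences.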
