bozo32

Results: 18 comments by bozo32

Please allow reporting by or across sources. Reporting by source would permit iteration through a set of sources.

Thank you, I'll try again.

Is there a way to iterate through rather than aggregate across a collection, so that we can have the LLM answer by data source? This may be useful for comparison between...
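
For illustration, a minimal sketch of what answering per data source (rather than over one aggregated collection) might look like. This assumes the ollama Python client, a locally pulled "llama3" model, and a hypothetical retrieve() helper standing in for per-source retrieval; none of this is from an existing implementation.

```python
import ollama

def retrieve(source: str, query: str) -> list[str]:
    # Placeholder: in a real setup this would query a per-source index.
    return [f"(top passages from {source} relevant to: {query})"]

sources = ["paper_a.pdf", "paper_b.pdf", "interview_notes.txt"]
question = "What explanation does this source give for the observed effect?"

answers = {}
for source in sources:
    passages = retrieve(source, question)
    prompt = (
        f"Answer using only the excerpts from {source} below.\n\n"
        + "\n\n".join(passages)
        + f"\n\nQuestion: {question}"
    )
    # One call per source, so answers can be compared side by side
    # instead of being blended across the whole collection.
    reply = ollama.chat(model="llama3", messages=[{"role": "user", "content": prompt}])
    answers[source] = reply["message"]["content"]

for source, answer in answers.items():
    print(f"--- {source} ---\n{answer}\n")
```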

Not me, but for your info there is a bunch of stuff in command-r that would be phenomenal to support: https://github.com/cohere-ai/notebooks/blob/main/notebooks/Vanilla_RAG.ipynb They seem to have tweaked their embedding model to fit...

Adding it to the RAG template would be good. It seems that the parameter used to pull context/quotations may vary by LLM (e.g. command-r has its own special approach), so this may need...
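
As a rough sketch of what command-r's own approach looks like, based on my reading of the linked Cohere notebook: documents are passed alongside the message and the response carries citations back into them. The exact field names and the Cohere Python SDK call below are assumptions, and you need your own API key.

```python
import cohere

co = cohere.Client("YOUR_COHERE_API_KEY")  # assumed: a valid Cohere API key

# command-r grounded generation: documents ride along with the message,
# and the reply includes citations pointing back into those documents.
response = co.chat(
    model="command-r",
    message="What does the report say about energy use?",
    documents=[
        {"title": "report-2023", "snippet": "Energy use fell 12% year over year."},
        {"title": "report-2022", "snippet": "Energy use was flat compared to 2021."},
    ],
)

print(response.text)
# Each citation ties a span of the answer to the documents it rests on.
for citation in response.citations or []:
    print(citation)
```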

The last comment is about reporting by document or source versus across documents or sources. Sometimes when we're doing RAG or question answering with documents, we actually don't want to pull...

Hi. For people who are building very specific RAG setups, something like Flowise may be better. These (Langflow/Flowise) allow granular control and a really idiot-friendly way to see what...

There is no guide for integration with openwebui, nor should there be? Both Flowise and Langflow allow you to drop quite a number of different LLMs into the flows. One option,...

Will this permit choice of index? https://docs.llamaindex.ai/en/stable/module_guides/indexing/index_guide/ I'm not sure, but for data sources that are somewhat internally structured (e.g. academic papers), when iterating through data sources the tree structure may...
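
For what it's worth, a minimal sketch of choosing a tree index instead of the default vector index in LlamaIndex. It assumes the llama-index >= 0.10 import layout, an already configured LLM/embedding model, and papers sitting in a local papers/ directory.

```python
from llama_index.core import SimpleDirectoryReader, TreeIndex, VectorStoreIndex

# Load one internally structured source (e.g. a single academic paper).
docs = SimpleDirectoryReader("papers/").load_data()

# Tree index: summarises the document hierarchically, which may suit sources
# with their own internal structure; the vector index is the flat default.
tree_index = TreeIndex.from_documents(docs)
vector_index = VectorStoreIndex.from_documents(docs)

question = "What method does this paper use?"
print(tree_index.as_query_engine().query(question))
print(vector_index.as_query_engine().query(question))
```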

As mentioned on Reddit, if the code were made a bit more idiot-friendly, I'd be happy to chew on making something that worked on my M1.