pipelines
enh: Support for monitoring /chat/completions endpoint of OpenWebUI
Need support for logging requests to http://localhost:3000/api/chat/completions with Langfuse. This would be helpful for users who use OWUI through the API as well as through the front-end.
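For reference, a minimal sketch of what such an API request looks like (the base URL and path are from this issue; the API key, model name, and question are placeholders):

```python
import json
import urllib.request


def build_completion_request(base_url: str, api_key: str,
                             model: str, question: str) -> urllib.request.Request:
    """Build a POST request for Open WebUI's /api/chat/completions endpoint."""
    payload = {
        "model": model,
        "messages": [{"role": "user", "content": question}],
    }
    return urllib.request.Request(
        url=f"{base_url}/api/chat/completions",
        data=json.dumps(payload).encode("utf-8"),
        headers={
            "Authorization": f"Bearer {api_key}",
            "Content-Type": "application/json",
        },
        method="POST",
    )


# Sending it requires a running instance, so it is commented out here:
# req = build_completion_request("http://localhost:3000", "<api-key>", "<model>", "Hello")
# with urllib.request.urlopen(req) as resp:
#     print(json.load(resp))
```

These are the requests that currently bypass the Langfuse filter when sent directly to the API.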
I see that the outlet filter is triggered when calling from the WebUI, but not when calling through the API. Is this expected, or do I have to change something in the pipeline file to trigger the outlet?
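For context, this is roughly the shape of the outlet hook in a filter pipeline (a simplified sketch; the print stands in for the actual Langfuse logging, and the valves/config that a real filter carries are omitted):

```python
from typing import Optional


class Pipeline:
    """Minimal filter-pipeline sketch; type = "filter" marks it as a filter."""

    def __init__(self):
        self.type = "filter"
        self.name = "Outlet Sketch"

    async def outlet(self, body: dict, user: Optional[dict] = None) -> dict:
        # In the Langfuse filter, this is where the completed generation
        # would be logged. The question is why this hook only fires for
        # WebUI-originated chats and not for direct API calls.
        print(f"outlet fired for chat {body.get('chat_id')}")
        return body
```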
It would also be great if it were possible to launch a chat completion against a model, with a given question, from inside a pipeline. Why? To automate RAGAS evaluation through Langfuse.
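As a rough sketch of that idea, a pipeline could post an evaluation question back to the local API and then score the answer. This assumes the endpoint from this issue and an OpenAI-style response shape; `score_with_ragas` is a hypothetical hook, not a real function:

```python
import json
import urllib.request

# Endpoint from this issue; adjust host/port for your deployment.
OWUI_CHAT_URL = "http://localhost:3000/api/chat/completions"


def extract_answer(response: dict) -> str:
    """Pull the assistant's text out of an OpenAI-style completion response."""
    return response["choices"][0]["message"]["content"]


def ask_model(question: str, model: str, api_key: str) -> str:
    """Launch one chat completion through Open WebUI and return the answer text.

    Requires a running instance. A RAGAS step could then score the result,
    e.g. score_with_ragas(question, answer)  # hypothetical hook
    """
    req = urllib.request.Request(
        OWUI_CHAT_URL,
        data=json.dumps({
            "model": model,
            "messages": [{"role": "user", "content": question}],
        }).encode("utf-8"),
        headers={"Authorization": f"Bearer {api_key}",
                 "Content-Type": "application/json"},
        method="POST",
    )
    with urllib.request.urlopen(req) as resp:
        return extract_answer(json.load(resp))
```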
See: https://github.com/open-webui/open-webui/issues/3237