[Feature Request] Get the input of chat/prompt of LLM node
Is your feature request related to a problem? Please describe. I want to record the prompt and LLM settings for different runs in my DB. However, I found that promptflow doesn't support any hook in the LLM node to retrieve the input before running, nor any method to retrieve the complete prompt after running.
Describe the solution you'd like A hook before the LLM node runs, or a method to retrieve the prompt after it runs.
Describe alternatives you've considered Building a custom tool that connects to the LLM, or formulating the prompt myself before passing it into the LLM node (see the sketch below). For this, I would need to give up useful features such as variants.
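For reference, a minimal sketch of that second workaround, assuming a flow where the prompt is rendered in a custom Python tool and the result is fed into the LLM node. The `record_prompt` name, the single `question` input, the Jinja2 rendering, and the sqlite storage are all hypothetical choices, not promptflow APIs:

```python
# record_prompt.py -- hypothetical custom tool that renders the prompt
# itself and records it before a downstream LLM node consumes it.
import sqlite3

from jinja2 import Template
from promptflow import tool


@tool
def record_prompt(prompt_template: str, question: str) -> str:
    # Render the prompt the same way an LLM node would render its template.
    prompt = Template(prompt_template).render(question=question)

    # Persist the final prompt for this run (replace with your own DB).
    with sqlite3.connect("runs.db") as db:
        db.execute("CREATE TABLE IF NOT EXISTS prompts (prompt TEXT)")
        db.execute("INSERT INTO prompts (prompt) VALUES (?)", (prompt,))

    return prompt  # feed this rendered prompt into the LLM node's input
```

This records the prompt, but as noted above it means the LLM node receives a pre-rendered string, so template variants on that node no longer apply.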
+++ This can be very helpful.
In addition, I think we need application integration or an internal tool to receive and work with traces.
Appreciate your feedback @sparkdemotivator @ArtyomZemlyak! The LLM settings and metrics from the LLM API request/response are now exposed in the trace log of each node run. Actually, you're not alone in this request; other users have expressed similar needs. We've taken note of this and have included it as a new feature request on our development roadmap.
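For anyone who wants to pull run data programmatically today, here is a hedged sketch using the promptflow SDK's `PFClient`; `"my_run"` is a placeholder run name, and which trace fields appear in the returned details may vary by version:

```python
from promptflow import PFClient

pf = PFClient()

# "my_run" is a placeholder -- substitute the name of one of your runs.
# get_details returns a pandas DataFrame of per-line inputs and outputs;
# node-level trace logs carry the LLM settings and token metrics.
details = pf.get_details("my_run")
print(details.head())

# Aggregated metrics for the run, if any were logged.
metrics = pf.get_metrics("my_run")
print(metrics)
```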