adampolak-vertex

7 comments by adampolak-vertex

> Hi @adampolak-vertex, thanks for reaching out!
>
> Based on my understanding of trace and OpenTelemetry, when the endpoint returns a response, the corresponding trace/span is not close...

> Hi @adampolak-vertex, instead of sending the trace back as part of the response, is it possible for you to implement your own trace exporter which directly sends the trace to...
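For illustration, a minimal sketch of such a custom exporter built on the standard `opentelemetry-sdk` package; the `HttpSpanExporter` name, the collector URL, and the payload shape are assumptions of this example, not part of prompt flow itself.

```python
import json
from typing import Sequence

import requests
from opentelemetry import trace
from opentelemetry.sdk.trace import ReadableSpan, TracerProvider
from opentelemetry.sdk.trace.export import (
    SimpleSpanProcessor,
    SpanExporter,
    SpanExportResult,
)


class HttpSpanExporter(SpanExporter):
    """Posts every finished span directly to a service of your own."""

    def __init__(self, url: str) -> None:
        self._url = url

    def export(self, spans: Sequence[ReadableSpan]) -> SpanExportResult:
        # ReadableSpan.to_json() comes from the SDK; the receiving endpoint is ours.
        payload = [json.loads(span.to_json()) for span in spans]
        try:
            requests.post(self._url, json=payload, timeout=5)
            return SpanExportResult.SUCCESS
        except requests.RequestException:
            return SpanExportResult.FAILURE

    def shutdown(self) -> None:
        pass


# Register the exporter so each span is delivered as soon as it ends,
# independent of the endpoint response.
provider = TracerProvider()
provider.add_span_processor(
    SimpleSpanProcessor(HttpSpanExporter("http://localhost:9000/spans"))
)
trace.set_tracer_provider(provider)
```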

> To point traces to a different server, you can use the OTel environment variable `OTEL_EXPORTER_OTLP_TRACES_ENDPOINT`; PF will honor this setting and export traces to that endpoint - this...
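As a quick sketch of that setting, assuming the flow is served from a Python process; the collector URL below is a placeholder, and exporting the variable in the shell before launching the flow works the same way.

```python
import os

# Standard OTel variable honored by OTLP exporters, including the one
# prompt flow configures; the URL below is a placeholder.
os.environ["OTEL_EXPORTER_OTLP_TRACES_ENDPOINT"] = "http://my-collector:4318/v1/traces"
```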

> Meanwhile, token usage should be available, but it's different from tracing. Providing token usage in the endpoint response should be a separate topic from prompt flow; tagging Xiaopeng @wxpjimmy to add...
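For what it's worth, a hedged sketch of returning token usage alongside the model output from a flow function; it assumes a direct call with the `openai` client, and the `answer_with_usage` wrapper and its return shape are inventions of this example.

```python
from openai import OpenAI

client = OpenAI()


def answer_with_usage(question: str) -> dict:
    completion = client.chat.completions.create(
        model="gpt-4o-mini",
        messages=[{"role": "user", "content": question}],
    )
    return {
        "answer": completion.choices[0].message.content,
        # The API reports usage per request, so it can travel back in the
        # endpoint response without involving tracing at all.
        "token_usage": {
            "prompt_tokens": completion.usage.prompt_tokens,
            "completion_tokens": completion.usage.completion_tokens,
            "total_tokens": completion.usage.total_tokens,
        },
    }
```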

This is important. Is there a solution for this? We cannot use PromptFlow as an endpoint if it cannot do bulk processing.

> Hi @adampolak-vertex, thanks for reporting this. Currently, promptflow only supports bulk run inputs with each input as a single LLM call. To aggregate multiple inputs as one in the...

> Yeah, the problem you mentioned is real. In addition, I'd like to mention that if we support batch inputs aggregated in one prompt and send the call, then to...
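To make the trade-off concrete, a hedged sketch of aggregating several inputs into one prompt and splitting the answers back out; the numbering convention and the splitting logic are assumptions of this example, and the brittleness of that mapping is part of the concern raised above.

```python
from openai import OpenAI

client = OpenAI()


def answer_batch(questions: list[str]) -> list[str]:
    # Number the questions so the model can answer them in order.
    numbered = "\n".join(f"{i + 1}. {q}" for i, q in enumerate(questions))
    prompt = (
        "Answer each numbered question on its own line, "
        "prefixed with its number:\n" + numbered
    )
    completion = client.chat.completions.create(
        model="gpt-4o-mini",
        messages=[{"role": "user", "content": prompt}],
    )
    text = completion.choices[0].message.content or ""
    # Mapping answers back to inputs relies on the model honoring the
    # requested format, which is where this approach gets fragile.
    return [line.partition(". ")[2] for line in text.splitlines() if line.strip()]
```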