How to enable Live Streaming over Direct Line for a Copilot Studio–deployed agent?
I have a question
I’m connecting a Copilot Studio–deployed agent to Web Chat via Direct Line (same environment as the sample below). The docs suggest implementing live streaming on the bot side, but in my case the app is built with Copilot Studio, not a custom Bot Framework bot.
Question:
Is there a way to enable live streaming (token-by-token / incremental responses) when using a Copilot Studio agent over Direct Line?
If yes, what settings or configuration are required (e.g., feature flags in Copilot Studio, Direct Line/Web Chat options, or server-side capability toggles)?
Currently, using the sample setup, responses are not streamed; they appear only after completion.
Sample I’m following: 👉 https://github.com/microsoft/Agents/tree/main/samples/nodejs/copilotstudio-webchat-react
✅ Expected Behavior
Responses from the agent should stream progressively (e.g., partial tokens/segments) in Web Chat when connected through Direct Line.
🧱 Actual Behavior
Responses render only after the entire message is generated—no incremental streaming observed.
🔁 Steps to Reproduce
Deploy an agent using Copilot Studio.
Connect the agent to a React Web Chat app via Direct Line using the sample linked above.
Send a prompt that usually takes a few seconds to answer.
Observe that the message renders only when complete (no streaming effect).
🖥️ Environment
Copilot Studio Agent: built in Copilot Studio (no custom Bot Framework code)
Connection: Direct Line → Web Chat (React)
Sample: copilotstudio-webchat-react
Packages:
```json
{
  "@azure/msal-browser": "^4.13.1",
  "@microsoft/agents-copilotstudio-client": "^1.0.15",
  "botframework-webchat": "4.18.1-main.20250912.cbaf98f",
  "botframework-webchat-fluent-theme": "4.18.1-main.20250912.cbaf98f",
  "react": "16.8.6",
  "react-dom": "16.8.6"
}
```
Browser: Chrome (latest)
Region/Tenant: (add if relevant)
❓Questions / Clarifications
Does Copilot Studio support server-driven streaming when used with Direct Line (not Direct Line Speech)?
If supported, how do we enable it? Any required Copilot Studio settings, Direct Line flags, or Web Chat props to turn on streaming?
If not supported today, is there a recommended workaround (e.g., custom bot handoff, different channel, or preview feature) or any ETA?
I am also facing the same issue.
@compulim
@benbrown, @MattB-msft do you have a way to enable streaming in @microsoft/agents-copilotstudio-client?
Web Chat supports streaming, but it has to be supported on the adapter level as well.
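For context, here is a rough sketch of what adapter-level streaming typically looks like on the wire. This is an illustration only, under the assumption that intermediate chunks arrive as `typing` activities carrying the text-so-far and a final `message` activity carries the complete text; the exact field names and activity shapes are not the confirmed Copilot Studio contract.

```javascript
// Sketch only: illustrates the general shape of a streamed response over
// Direct Line, ASSUMING intermediate chunks arrive as "typing" activities
// carrying the accumulated text, followed by a final "message" activity
// with the complete text. Activity shapes here are illustrative, not the
// confirmed Copilot Studio contract.

function coalesceStream(activities) {
  let latestPartial = '';
  let finalText = null;

  for (const activity of activities) {
    if (activity.type === 'typing' && typeof activity.text === 'string') {
      // Each streaming chunk is assumed to carry the text-so-far.
      latestPartial = activity.text;
    } else if (activity.type === 'message') {
      // The final message activity supersedes all partial chunks.
      finalText = activity.text;
    }
  }

  return finalText !== null ? finalText : latestPartial;
}
```

If the service only ever emits the final `message` activity (no intermediate `typing` chunks), there is nothing for the adapter or Web Chat to render incrementally, which would match the behavior reported above.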
The latest version of the sample and the copilotstudio-client package support streaming.
However, a change in the underlying service behavior has not yet been addressed in the SDK, which makes it appear that streaming is not working. A patch for the SDK is underway:
https://github.com/microsoft/Agents-for-js/issues/820
I am also facing this issue.
Thanks for your work on this so far @benbrown. I've been following your work on streaming behaviour, the latest being https://github.com/microsoft/Agents-for-js/pull/828.
I've tried the latest npm package 1.1.4-g8d884129e7 for @microsoft/agents-copilotstudio-client which I believe contains your latest streaming fixes.
FYI, I'm using `<BasicWebChat>` via a Direct Line connection to a Copilot Studio bot, and messages still only render when complete (no streaming effect).
Is there any particular setting I need to change, on the Copilot Studio side, or in @microsoft/agents-copilotstudio-client?
Update: even the CopilotStudioChatAPI example, which claims to showcase support for "streaming responses (ChatGPT style)", doesn't work.
Looking at the response activity when I send a message, Copilot Studio is still sending a single large activity with `type: message`, and I see no option to tell Copilot Studio to change its sending behaviour to stream.
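To confirm exactly what arrives over the wire, a Web Chat store middleware can log every incoming activity's type before rendering. The middleware shape below follows the standard `createStore` API from `botframework-webchat`; the `channelData` fields it logs are whatever the service happens to send, so this is a diagnostic sketch rather than a documented streaming contract.

```javascript
// Diagnostic sketch: a Web Chat store middleware that logs each incoming
// activity's type and raw channelData, to confirm whether any partial
// chunks ever arrive before the final message. Wire it up with the
// standard botframework-webchat API, e.g.:
//
//   import ReactWebChat, { createStore } from 'botframework-webchat';
//   const store = createStore({}, logIncomingActivity);
//   <ReactWebChat directLine={directLine} store={store} />

const logIncomingActivity = () => next => action => {
  if (action.type === 'DIRECT_LINE/INCOMING_ACTIVITY') {
    const activity = action.payload.activity;
    // channelData fields are service-specific; log them raw for inspection.
    console.log('[incoming]', activity.type, activity.channelData);
  }
  return next(action); // always forward the action unchanged
};
```

If the console only ever shows one `message` activity per response and no intermediate activities, the lack of streaming is on the service side, not in Web Chat rendering.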