
Context should try harder before saying "Sorry, but I can only assist with programming related questions."

Status: Open · potatoqualitee opened this issue 2 years ago · 14 comments

  • VS Code Version: Latest Stable
  • OS Version: Mac latest beta

Steps to Reproduce:

  1. Write a chapter in a book titled "AI for IT", Chapter 5 "Reports and Documentation" in Asciidoc (.adoc) format
  2. Auto complete a few paragraphs with inline Copilot
  3. Ask Copilot Chat "what should i write next? give me an outline and the conclusion paragraph for this section"
  4. Copilot Chat says "Sorry I can't help you with that. I can only help with programming"
  5. Tell it "But im writing an AI book for IT pros"
  6. Watch as it outlines the chapter

Thank you 🙇🏼

potatoqualitee avatar Dec 16 '23 15:12 potatoqualitee

Just a note: I think the "Sorry, but I can only assist with programming related questions." response should be relaxed a bit, since not everything we do in VS Code is programming; sometimes it's UI copy, translation resources, etc.

Sometimes I use Copilot to help me with suggestions for UI copy (e.g. when text is too long for a certain UI element, I ask Copilot to make it shorter and give it feedback like "can you make it more friendly?"). It's super helpful, especially for people who are not native speakers of a language and want to write good UI copy.

I got the "Sorry I can't help you with that. I can only help with programming" for the first time today.

Using this trick works: I just need to say "yeah, but I'm writing copy for a UI in React Native" and it will gladly output the answer. But I would urge the Copilot team to broaden the filter beyond programming alone to other product-building tasks :)

sergioisidoro avatar Jan 16 '24 08:01 sergioisidoro

@roblourens just saw your off topic label. am I in the wrong repo?

potatoqualitee avatar Jan 30 '24 04:01 potatoqualitee

No, the issue is about our overactive off-topic filter :)

roblourens avatar Jan 30 '24 19:01 roblourens

Another example:

Q: What are other terms like "tech debt" to describe code that is a work in progress, still being worked on, and in need of polish?
A: Sorry, but I can only assist with programming related questions.
Q: tech debt is programming related
A: Sorry, but I can only assist with programming related questions.

neo-anderson avatar Mar 14 '24 01:03 neo-anderson

03 May 2024: this is still an issue.

(screenshot attached)

dpk899 avatar May 03 '24 06:05 dpk899

Why, Copilot?

(screenshot attached)

Faazlaeeq avatar Jul 01 '24 10:07 Faazlaeeq

This happens more and more. Are we supposed to pay more?

fgaletic avatar Jul 07 '24 18:07 fgaletic

Same here... and my questions are definitely code related :-D

Multisaft7 avatar Jul 16 '24 12:07 Multisaft7

I've been suffering from this more and more too. I ask it to literally generate some code, and I get "Sorry, but I can only assist with programming related questions." back. And then, if I just answer with something like "yes you can", it generates what I asked for. I'm practically having an aneurysm trying to make it useful. It's rather quick to just... not do what it was created for, whether or not the question or prompt is programming related. That should be up to us, not to an arbitrary ruling or the weights in the model. Not when it is clearly not up to the task.

frigvid avatar Jul 17 '24 12:07 frigvid

Same here, with code related questions...

luismvargasg avatar Jul 17 '24 17:07 luismvargasg

> I've been suffering with this more and more too. I ask it to literally generate some code, and I get "Sorry, but I can only assist with programming related questions." back. And then, if I only answer with something like "yes you can" it generates what I asked for. [...]

It's crazy how many times I had to say "yes you can" to it and then it does what it's supposed to. Super annoying.

fgaletic avatar Jul 17 '24 19:07 fgaletic

This does not solve the problem, but it is a workaround: I created a GitHub Copilot Chat extension that lets you use the OpenAI API (Azure or openai.com) within GitHub Copilot Chat.

https://marketplace.visualstudio.com/items?itemName=chrissylemaire.assistants-chat-extension

(screenshot attached)

If you don't already have an assistant set up, it'll offer to set up a Beavis and Butt-Head assistant, which demonstrates that there really aren't any restrictions on the chat output.

(screenshot attached)

Note that this does cost API tokens.

potatoqualitee avatar Jul 18 '24 16:07 potatoqualitee
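
The extension's internals aren't shown in the thread, but the general idea — forwarding the chat prompt straight to the OpenAI API, which applies no Copilot-style off-topic filter — can be sketched as follows. This is a minimal illustration, not the extension's actual code; the model name is an example, and the request shape follows the standard OpenAI chat-completions endpoint. As noted above, actually sending the request costs API tokens.

```python
# Minimal sketch: build a request for the OpenAI chat completions API.
# Assumes OPENAI_API_KEY is set in the environment; "gpt-4o-mini" is an
# example model name, not necessarily what the extension uses.
import json
import os
import urllib.request


def build_request(prompt: str, model: str = "gpt-4o-mini") -> urllib.request.Request:
    """Construct (but do not send) a chat-completions HTTP request."""
    body = json.dumps({
        "model": model,
        "messages": [
            {"role": "system", "content": "You are a helpful assistant."},
            {"role": "user", "content": prompt},
        ],
    }).encode()
    return urllib.request.Request(
        "https://api.openai.com/v1/chat/completions",
        data=body,
        headers={
            "Content-Type": "application/json",
            "Authorization": f"Bearer {os.environ.get('OPENAI_API_KEY', '')}",
        },
    )


# To actually send it (this is the part that costs API tokens):
# with urllib.request.urlopen(build_request("Outline chapter 5 for me")) as resp:
#     print(json.load(resp)["choices"][0]["message"]["content"])
```

Because the prompt goes directly to the model with whatever system message you choose, there is no intermediate classifier to reject "non-programming" questions — which is exactly why this works as a workaround.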

@potatoqualitee thank you, I got stuck at this step:

"In the search bar, type "Assistants Chat Extension". Couldn't find it.

Anyway, do we know anything about a recent update that made the chat worse? It's incredible how uncooperative and generic it's been lately.

fgaletic avatar Jul 19 '24 13:07 fgaletic

@fgaletic ah, I've renamed it since and will update the docs, thank you. You can just skip that step; it'll auto-detect the missing key and prompt you to add either OpenAI or Azure OpenAI info.

potatoqualitee avatar Jul 19 '24 16:07 potatoqualitee

We updated the off-topic filter and saw a significant drop in false-positive filtering in experiment results.

Closing as fixed, but please don't hesitate to file new issues in case you still run into this.

digitarald avatar Aug 16 '24 16:08 digitarald

thank you! 🙏🏼

potatoqualitee avatar Aug 22 '24 07:08 potatoqualitee