Error streaming diff: TypeError: templateMessages is not a function
### Before submitting your bug report
- [x] I believe this is a bug. I'll try to join the Continue Discord for questions
- [x] I'm not able to find an open issue that reports the same bug
- [x] I've seen the troubleshooting guide on the Continue Docs
### Relevant environment info
- OS: Windows 11
- Continue: v0.9.211 (pre-release)
- IDE: VS Code
- Model: claude-3.5-sonnet (OpenRouter)
- config.json:
```json
{
  "models": [
    {
      "title": "Anthropic: Claude 3.5 Sonnet",
      "provider": "openrouter",
      "model": "anthropic/claude-3.5-sonnet",
      "contextLength": 200000,
      "apiBase": "https://openrouter.ai/api/v1",
      "apiKey": "xxxxxx",
      "requestOptions": {
        "extraBodyProperties": {
          "transforms": []
        }
      }
    }
  ],
  "customCommands": [
    {
      "name": "test",
      "prompt": "{{{ input }}}\n\nWrite a comprehensive set of unit tests for the selected code. It should setup, run tests that check for correctness including important edge cases, and teardown. Ensure that the tests are complete and sophisticated. Give the tests just as chat output, don't edit any file.",
      "description": "Write unit tests for highlighted code"
    }
  ],
  "tabAutocompleteModel": {
    "title": "Starcoder2 3b",
    "provider": "ollama",
    "model": "starcoder2:3b"
  },
  "contextProviders": [
    { "name": "code", "params": {} },
    { "name": "docs", "params": {} },
    { "name": "diff", "params": {} },
    { "name": "terminal", "params": {} },
    { "name": "problems", "params": {} },
    { "name": "folder", "params": {} },
    { "name": "codebase", "params": {} }
  ],
  "slashCommands": [
    { "name": "edit", "description": "Edit selected code" },
    { "name": "comment", "description": "Write comments for the selected code" },
    { "name": "share", "description": "Export the current chat session to markdown" },
    { "name": "cmd", "description": "Generate a shell command" },
    { "name": "commit", "description": "Generate a git commit message" }
  ]
}
```
### Description
Cannot get code Edit (Ctrl+I) to work:

`Error streaming diff: TypeError: templateMessages is not a function`
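For context, the error message suggests the edit/diff path calls a prompt-templating function that the selected provider object doesn't define. A minimal TypeScript sketch (hypothetical names, not Continue's actual code) of how an undefined `templateMessages` produces this exact TypeError:

```typescript
type Message = { role: string; content: string };

interface LLMProvider {
  // Optional: some providers define a prompt-templating function, others don't.
  templateMessages?: (msgs: Message[]) => string;
}

// Hypothetical stand-in for the "openrouter" provider: no templateMessages defined.
const openrouter: LLMProvider = {};

function streamDiff(provider: LLMProvider, msgs: Message[]): string {
  // No guard here: if the provider lacks templateMessages, calling it throws
  // "TypeError: templateMessages is not a function".
  const templateMessages = provider.templateMessages!;
  return templateMessages(msgs);
}

let error = "";
try {
  streamDiff(openrouter, [{ role: "user", content: "set to false" }]);
} catch (e) {
  error = (e as Error).message;
}
console.log(error); // prints "templateMessages is not a function"
```

If this is roughly what happens inside the extension, a guard or a default template for providers that don't define one would avoid the crash.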
### To reproduce
- Select code
- Press Ctrl+I
- Enter any instruction in the dialog
- The error message appears
### Log output
No response
Experiencing the same issue with release version 0.8.52, VS Code, the same config.json, macOS.
Update from Discord:
- switching `"provider": "openrouter"` to `"provider": "openai"` fixes the issue
- there is an unreleased fix; not sure if it will allow "openrouter" to be used as a value
I also got this error while using:
```json
{
  "title": "Claude 3.5 Sonnet",
  "model": "anthropic/claude-3.5-sonnet:beta",
  "cacheBehavior": {
    "cacheSystemMessage": true,
    "cacheConversation": true
  },
  "provider": "openrouter",
  "apiBase": "https://openrouter.ai/api/v1",
  "apiKey": "..."
},
```
Switching the provider from openrouter to anthropic does fix this issue, but causes others:
- Chat: stopped working. When I hit enter, it starts "thinking", but then stops, outputs nothing, and creates a second prompt below.
- Editor: Doesn't fail right away, but now the only output I get are deleted lines.
`self.is_initialized = True`

I pressed Ctrl+I on this line and asked: "set to false". Instead, it suggested deleting the whole line. The same happens with prompts applied to bigger blocks of text or the whole file: all of the selected code is proposed for deletion. Also, when I tried to apply a code suggestion from the Chat, it deleted the whole file.
Any update on this? It's really annoying not to be able to use the Ctrl+I features; I have to resort to Cursor for these tasks.
Same happens for me. This is my config:
```json
{
  "models": [
    {
      "title": "openai/gpt-4o-mini",
      "provider": "openrouter",
      "model": "openai/gpt-4o-mini",
      "apiBase": "https://openrouter.ai/api/v1",
      "apiKey": "REDACTED"
    },
    {
      "title": "anthropic/claude-3.5-sonnet",
      "provider": "openrouter",
      "model": "anthropic/claude-3.5-sonnet",
      "apiBase": "https://openrouter.ai/api/v1",
      "apiKey": "REDACTED"
    }
  ],
  "tabAutocompleteModel": {
    "title": "OpenAI O mini",
    "provider": "openrouter",
    "model": "openai/gpt-4o-mini",
    "apiBase": "https://openrouter.ai/api/v1",
    "apiKey": "REDACTED"
  },
  "customCommands": [
    {
      "name": "test",
      "prompt": "{{{ input }}}\n\nWrite a comprehensive set of unit tests for the selected code. It should setup, run tests that check for correctness including important edge cases, and teardown. Ensure that the tests are complete and sophisticated. Give the tests just as chat output, don't edit any file.",
      "description": "Write unit tests for highlighted code"
    }
  ],
  "contextProviders": [
    { "name": "code", "params": {} },
    { "name": "docs", "params": {} },
    { "name": "diff", "params": {} },
    { "name": "terminal", "params": {} },
    { "name": "problems", "params": {} },
    { "name": "folder", "params": {} },
    { "name": "codebase", "params": {} }
  ],
  "slashCommands": [
    { "name": "share", "description": "Export the current chat session to markdown" },
    { "name": "cmd", "description": "Generate a shell command" },
    { "name": "commit", "description": "Generate a git commit message" }
  ]
}
```
Any update on this? It's really cumbersome having to switch to Cursor every time I need to use Quick Edits (Ctrl+I).
Changing `"provider": "openrouter"` to `"provider": "openai"` seems to work.
It's not really a workaround since I want to use OpenRouter.
> It's not really a workaround since I want to use open router

You can use OpenRouter with its OpenAI-compatible API:

`"provider": "openai",`
`"apiBase": "https://openrouter.ai/api/v1",`
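Spelled out as a complete model entry, the suggested workaround would look roughly like this (the title is arbitrary and the apiKey is a placeholder; untested sketch based on the snippet above):

```json
{
  "title": "Claude 3.5 Sonnet (OpenRouter via OpenAI-compatible API)",
  "provider": "openai",
  "model": "anthropic/claude-3.5-sonnet",
  "apiBase": "https://openrouter.ai/api/v1",
  "apiKey": "xxxxxx"
}
```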
> Changing `"provider": "openrouter"` to `"provider": "openai"` seems to work.

@ilyasofficial1617 I had tried that before, and I think it made Quick Edits work, but I remember it broke something else in Continue. I don't remember what, but it made me revert the fix. Will try again.
I am also having this issue. VS Code, latest version of the extension. I tried changing it to `"provider": "openai"` as suggested, and that seemed to help, but when I apply the change it actually adds the new code to the end of the file instead of updating the function that was changed.
I also have this error with a local model (via the LM Studio preset). I can post a request to a regular chat window, but once I try to use the Ctrl+L or Ctrl+I option, I get the `error streaming diff` error.
It would still be good to have the "openrouter" provider working, though. I imagine the "openrouter" provider is necessary so that, for each model it serves, we can select the upstream model provider (such as DeepInfra/Together/Fireworks/etc. for the DeepSeek-R1 model)?
> I also have this error with a local model (via LM Studio preset). I can post a request to a regular chat window, but once I try to use CTRL+L or CTRL+I option, I get the: error streaming diff

I met with the same error, but slightly different: `you must provide a messages parameter`
> It's not really a workaround since I want to use open router
>
> You can use openrouter with its openai compatible api:
> `"provider": "openai", "apiBase": "https://openrouter.ai/api/v1",`
This appears to work for me. I had to copy a model entry that I would normally use, change those two fields, and name it "Hack" just to use it.
Does this allow passing OpenRouter parameters, such as preferred providers?
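OpenRouter's provider-routing preferences are ordinary request-body fields, so they may be passable through `requestOptions.extraBodyProperties` (the same mechanism the first config in this thread uses for `transforms`). An untested sketch, assuming the OpenAI-compatible workaround above and the `order`/`allow_fallbacks` fields from OpenRouter's provider-routing docs:

```json
{
  "title": "Claude 3.5 Sonnet (preferred providers)",
  "provider": "openai",
  "model": "anthropic/claude-3.5-sonnet",
  "apiBase": "https://openrouter.ai/api/v1",
  "apiKey": "xxxxxx",
  "requestOptions": {
    "extraBodyProperties": {
      "provider": {
        "order": ["Anthropic"],
        "allow_fallbacks": true
      }
    }
  }
}
```

Whether Continue forwards these nested properties intact is exactly what would need testing.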
Same issue; my error message is:

`Error streaming diff: Error: JSON.parse(...)?.error?.replace is not a function`
config.yaml:
```yaml
name: Local Assistant
version: 1.0.0
schema: v1
models:
  - name: Text Embedding-3 Large
    provider: openai
    model: text-embedding-3-large
    apiBase: https://<MyAzureOpenAI>.openai.azure.com/openai/deployments/text-embedding-3-large/embeddings?api-version=2023-05-15
    apiKey: <MyApiKey>
    env:
      apiVersion: 2023-05-15
      deployment: text-embedding-3-large
      apiType: azure-openai
context:
  - provider: code
  - provider: docs
  - provider: diff
  - provider: terminal
  - provider: problems
  - provider: folder
  - provider: codebase
```
This issue hasn't been updated in 90 days and will be closed after an additional 10 days without activity. If it's still important, please leave a comment and share any new information that would help us address the issue.
This issue was closed because it wasn't updated for 10 days after being marked stale. If it's still important, please reopen + comment and we'll gladly take another look!