Open
Labels: area:configuration (Relates to configuration options), ide:vscode (Relates specifically to VS Code extension), kind:bug (Indicates an unexpected problem or unintended behavior), os:windows (Happening specifically on Windows)
Description
Before submitting your bug report
- I've tried using the "Ask AI" feature on the Continue docs site to see if the docs have an answer
- I'm not able to find a related conversation on GitHub discussions that reports the same bug
- I'm not able to find an open issue that reports the same bug
- I've seen the troubleshooting guide on the Continue Docs
Relevant environment info
- OS: Windows 11 Pro
- Continue version: 1.2.16
- IDE version: 1.106.3
- Model: -
- config:
    - name: -
      provider: openai
      model: qwen2.5-coder:7b
      capabilities: [tool_use]
      apiBase: https://......../v1
      apiKey: ....
      roles:
        - chat
        - edit
        - autocomplete
        - apply
        - summarize
      defaultCompletionOptions:
        contextLength: 6144
        maxTokens: 2048
      requestOptions:
        verifySsl: false
Description
I have an API that uses a self-signed certificate, and I have already installed its root certificate into the Windows system certificate store.
However, Node.js does not appear to use the system certificates: when I send a request to the API, an error is thrown and the API cannot be used.
I have checked the logs, including the developer debug window in VS Code (the JS logs), but there is no useful information beyond a single message: "Connection error." That's all there is.
The API works fine over plain HTTP. As soon as I switch to HTTPS (TLS/SSL), the error is triggered.
I have tried the following solutions:
- Setting system environment variables to force Node.js to use system certificates
- Specifying the path to system certificates for Node.js (via configurations in VS Code)
- Disabling SSL verification in the configuration file of Continue
- Replacing the OpenAI provider with the Ollama provider (a server I deployed myself, with Ollama as the backend). This still didn't work, though the error message changed to "Failed to verify the root certificate of the SSL certificate", even after I tried the steps above.
- Requesting the same URL directly via curl works normally, so the API backend itself is not the issue.
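For reference, the environment-variable approaches from the first two bullets look roughly like this. The path is hypothetical, and this is POSIX shell syntax (on Windows the equivalent would be `setx` or PowerShell's `$env:`); note also that the VS Code extension host only sees variables present when VS Code itself starts, so a full restart is needed, and whether it honors `NODE_OPTIONS` is not guaranteed:

```shell
# Hypothetical path to the root CA in PEM format; adjust to the real location.
# Node appends these certificates to its built-in trust bundle.
export NODE_EXTRA_CA_CERTS="$HOME/certs/my-root-ca.pem"

# Alternatively, tell Node to use the OpenSSL/system CA store instead of its
# bundled one (a long-standing Node CLI flag).
export NODE_OPTIONS="--use-openssl-ca"
```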
To reproduce
- Deploy an OpenAI-style API with a self-signed certificate
- Configure Continue
- Attempt to have a conversation (with the API)
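For step 1, a throwaway self-signed certificate for a local test server can be generated like this (a sketch; the filenames and subject are arbitrary):

```shell
# Generate a self-signed certificate and private key valid for one day.
# CN=localhost so that requests to a local HTTPS server match the cert.
openssl req -x509 -newkey rsa:2048 -nodes \
  -keyout selfsigned.key -out selfsigned.crt \
  -days 1 -subj "/CN=localhost"
```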
Log output
[Extension Host] [@continuedev] error: Connection error. {"context":"llm_stream_chat","model":"qwen2.5-coder:7b","provider":"openai","useOpenAIAdapter":true,"streamEnabled":true,"templateMessages":false}
workbench.desktop.main.js:sourcemap:528 [Extension Host] Error: Connection error.
at OpenAI.makeRequest (c:\Users\ngc13\.vscode\extensions\continue.continue-1.2.16-win32-x64\out\extension.js:177721:17)
at process.processTicksAndRejections (node:internal/process/task_queues:105:5)
at async OpenAIApi.chatCompletionStream (c:\Users\ngc13\.vscode\extensions\continue.continue-1.2.16-win32-x64\out\extension.js:181539:26)
at async OpenAI2.openAIAdapterStream (c:\Users\ngc13\.vscode\extensions\continue.continue-1.2.16-win32-x64\out\extension.js:304704:26)
at async OpenAI2.streamChat (c:\Users\ngc13\.vscode\extensions\continue.continue-1.2.16-win32-x64\out\extension.js:304840:32)
at async llmStreamChat (c:\Users\ngc13\.vscode\extensions\continue.continue-1.2.16-win32-x64\out\extension.js:816016:19)
at async Wd.handleMessage [as value] (c:\Users\ngc13\.vscode\extensions\continue.continue-1.2.16-win32-x64\out\extension.js:854405:29)
workbench.desktop.main.js:sourcemap:36 ERR [Extension Host] Error handling webview message: {
"msg": {
"messageId": "70740b59-a098-41f7-87a6-bf1217fb07d7",
"messageType": "llm/streamChat",
"data": {
"completionOptions": {
"reasoning": false
},
"title": "LLLMqwen",
"messages": [
{
"role": "system",
"content": "....."
},
{
"role": "user",
"content": "hello"
}
],
"messageOptions": {
"precompiled": true
}
}
}
}
Error: Connection error.
error @ workbench.desktop.main.js:sourcemap:36
error @ workbench.desktop.main.js:sourcemap:36
error @ workbench.desktop.main.js:sourcemap:3828
mCs @ workbench.desktop.main.js:sourcemap:528
$logExtensionHostMessage @ workbench.desktop.main.js:sourcemap:528
S @ workbench.desktop.main.js:sourcemap:3869
Q @ workbench.desktop.main.js:sourcemap:3869
M @ workbench.desktop.main.js:sourcemap:3869
L @ workbench.desktop.main.js:sourcemap:3869
(anonymous) @ workbench.desktop.main.js:sourcemap:3869
C @ workbench.desktop.main.js:sourcemap:29
fire @ workbench.desktop.main.js:sourcemap:29
fire @ workbench.desktop.main.js:sourcemap:561
l.onmessage @ workbench.desktop.main.js:sourcemap:3880
workbench.desktop.main.js:sourcemap:3880 Extension Host
workbench.desktop.main.js:sourcemap:3880 [@continuedev] error: Connection error. {"context":"llm_stream_chat","model":"qwen2.5-coder:7b","provider":"openai","useOpenAIAdapter":true,"streamEnabled":true,"templateMessages":false}