Freezing up #6089

Open
5T3N82 opened this issue Nov 7, 2024 · 1 comment
Labels
bug (Something isn't working), repo/cody

Comments

5T3N82 commented Nov 7, 2024

Type: Bug

Extension Information

  • Cody Version: 1.40.2
  • VS Code Version: 1.95.1
  • Extension Host: desktop

Steps to Reproduce

  1. Cody hangs when a prompt needs to be entered; typing in the chat input is impossible.
  2. This is possibly caused by the length of the accumulated chat context (see the sketch below).

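The logs below repeatedly show lines such as "DefaultPrompter.makePrompt Ignored 1258 chat messages due to context limit", which fits the hypothesis above. As a minimal, purely hypothetical sketch (this is not Cody's actual code; the message shape, character-based token estimate, and budget are assumptions for illustration only), a chat client that trims history to a fixed token budget ends up ignoring most of a very long conversation:

```typescript
// Hypothetical illustration of context-limit trimming; not Cody's implementation.
interface ChatMessage {
  speaker: "human" | "assistant";
  text: string;
}

// Rough token estimate (assumption): ~4 characters per token.
function estimateTokens(text: string): number {
  return Math.ceil(text.length / 4);
}

// Keep the most recent messages that fit in the budget; report how many
// older messages were ignored (dropped), newest-first.
function trimToContextLimit(
  history: ChatMessage[],
  tokenBudget: number
): { kept: ChatMessage[]; ignored: number } {
  const kept: ChatMessage[] = [];
  let used = 0;
  for (let i = history.length - 1; i >= 0; i--) {
    const cost = estimateTokens(history[i].text);
    if (used + cost > tokenBudget) {
      return { kept, ignored: i + 1 };
    }
    used += cost;
    kept.unshift(history[i]);
  }
  return { kept, ignored: 0 };
}

// Example: a 1300-message conversation with a small budget ignores most
// of the history, similar in scale to the counts in the log below.
const history: ChatMessage[] = Array.from({ length: 1300 }, (_, i) => ({
  speaker: i % 2 === 0 ? "human" : "assistant",
  text: `message ${i} `.repeat(50),
}));
const { kept, ignored } = trimToContextLimit(history, 7000);
console.log(`kept=${kept.length} ignored=${ignored} of ${history.length}`);
```
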
Expected Behaviour

Logs

█ telemetry-v2 recordEvent: cody.extension/savedLogin:
█ auth Authenticating to https://sourcegraph.com/...:
█ ModelsService User model preferences changed: {"defaults":{"chat":"anthropic::2023-06-01::claude-3.5-sonnet","edit":"anthropic::2023-06-01::claude-3.5-sonnet","autocomplete":"fireworks::v1::deepseek-coder-v2-lite-base"},"selected":{}}
█ auth Authentication succeed to endpoint https://sourcegraph.com/:
█ telemetry-v2 recordEvent: cody.auth/connected:
█ ClientConfigSingleton refreshing configuration:
█ GraphQLTelemetryExporter evaluated export mode:: 5.2.5+
█ ChatsController:constructor init:
█ SymfRunner unsafeEnsureIndex: file:///c%3A/Users/dafor/Documents/Coding/SpellMagic/Spell%20magic%20v2/modules
█ GraphQLTelemetryExporter evaluated export mode:: 5.2.5+
█ ClientConfigSingleton refreshed: {"codyEnabled":true,"chatEnabled":true,"autoCompleteEnabled":true,"customCommandsEnabled":true,"attributionEnabled":false,"smartContextWindowEnabled":true,"modelsAPIEnabled":false,"latestSupportedCompletionsStreamAPIVersion":5}
█ ModelsService new models API enabled:
█ ModelsService ModelsData changed: 12 primary models
█ Autocomplete:initialized using "dotcom-feature-flags": "fireworks::deepseek-coder-v2-lite-base":
█ UpstreamHealth Ping took 202ms (Gateway: 194ms):
█ ChatController updateViewConfig:
█ ChatController updateViewConfig:
█ ClientConfigSingleton refreshing configuration:
█ ClientConfigSingleton refreshed: {"codyEnabled":true,"chatEnabled":true,"autoCompleteEnabled":true,"customCommandsEnabled":true,"attributionEnabled":false,"smartContextWindowEnabled":true,"modelsAPIEnabled":false,"latestSupportedCompletionsStreamAPIVersion":5}
█ telemetry-v2 recordEvent: cody.humanMessageEditor/submit:
█ telemetry-v2 recordEvent: cody.chat-question/submitted:
█ DefaultPrompter.makePrompt Ignored 1258 chat messages due to context limit:
█ DefaultPrompter.makePrompt Ignored context due to context limit: user=0, corpus=0, previous=16:
█ telemetry-v2 recordEvent: cody.chat-question/executed:
█ ChatController streamAssistantResponse:
█ Autocomplete:onComplete duration:"9840ms" stopReason:"undefined" outputChannelId:"b963a9b5-4793-43ac-a947-83ae2c69d88c":
█ telemetry-v2 recordEvent: cody.chatResponse/hasCode:
█ ClientConfigSingleton refreshing configuration:
█ ClientConfigSingleton refreshed: {"codyEnabled":true,"chatEnabled":true,"autoCompleteEnabled":true,"customCommandsEnabled":true,"attributionEnabled":false,"smartContextWindowEnabled":true,"modelsAPIEnabled":false,"latestSupportedCompletionsStreamAPIVersion":5}
█ telemetry-v2 recordEvent: cody.humanMessageEditor/submit:
█ telemetry-v2 recordEvent: cody.chat-question/submitted:
█ DefaultPrompter.makePrompt Ignored 1258 chat messages due to context limit:
█ DefaultPrompter.makePrompt Ignored context due to context limit: user=0, corpus=0, previous=16:
█ telemetry-v2 recordEvent: cody.chat-question/executed:
█ ChatController streamAssistantResponse:
█ Autocomplete:onComplete duration:"16208ms" stopReason:"undefined" outputChannelId:"1bef30d5-c6b7-4ad7-8e71-3814cc891fa6":
█ telemetry-v2 recordEvent: cody.chatResponse/noCode:
█ ClientConfigSingleton refreshing configuration:
█ ClientConfigSingleton refreshed: {"codyEnabled":true,"chatEnabled":true,"autoCompleteEnabled":true,"customCommandsEnabled":true,"attributionEnabled":false,"smartContextWindowEnabled":true,"modelsAPIEnabled":false,"latestSupportedCompletionsStreamAPIVersion":5}
█ telemetry-v2 recordEvent: cody.humanMessageEditor/submit:
█ telemetry-v2 recordEvent: cody.chat-question/submitted:
█ DefaultPrompter.makePrompt Ignored 1260 chat messages due to context limit:
█ DefaultPrompter.makePrompt Ignored context due to context limit: user=0, corpus=0, previous=16:
█ telemetry-v2 recordEvent: cody.chat-question/executed:
█ ChatController streamAssistantResponse:
█ Autocomplete:onError duration:"11531ms" endpoint:"https://sourcegraph.com/.api/completions/stream?api-version=2&client-name=vscode&client-version=1.40.2" outputChannelId:"b3e8244c-0a86-4626-b455-69a305c2e68f": {"outputChannelId":"b3e8244c-0a86-4626-b455-69a305c2e68f","duration":11531,"err":"received no parseable response data from Anthropic"}
█ ChatController: postError received no parseable response data from Anthropic:
█ telemetry-v2 recordEvent: cody.chatResponse/noCode:
█ telemetry-v2 recordEvent: cody.editChatButton/clicked:
█ telemetry-v2 recordEvent: cody.humanMessageEditor/submit:
█ telemetry-v2 recordEvent: cody.chat-question/submitted:
█ DefaultPrompter.makePrompt Ignored 1260 chat messages due to context limit:
█ DefaultPrompter.makePrompt Ignored context due to context limit: user=0, corpus=0, previous=16:
█ telemetry-v2 recordEvent: cody.chat-question/executed:
█ ChatController streamAssistantResponse:
█ Autocomplete:onComplete duration:"13627ms" stopReason:"undefined" outputChannelId:"87838ccb-ab93-4dfb-9bb1-0ca6a7590427":
█ telemetry-v2 recordEvent: cody.chatResponse/hasCode:
█ ClientConfigSingleton refreshing configuration:
█ ClientConfigSingleton refreshed: {"codyEnabled":true,"chatEnabled":true,"autoCompleteEnabled":true,"customCommandsEnabled":true,"attributionEnabled":false,"smartContextWindowEnabled":true,"modelsAPIEnabled":false,"latestSupportedCompletionsStreamAPIVersion":5}
█ ClientConfigSingleton refreshing configuration:
█ ClientConfigSingleton refreshed: {"codyEnabled":true,"chatEnabled":true,"autoCompleteEnabled":true,"customCommandsEnabled":true,"attributionEnabled":false,"smartContextWindowEnabled":true,"modelsAPIEnabled":false,"latestSupportedCompletionsStreamAPIVersion":5}
█ telemetry-v2 recordEvent: cody.humanMessageEditor/submit:
█ telemetry-v2 recordEvent: cody.chat-question/submitted:
█ DefaultPrompter.makePrompt Ignored 1262 chat messages due to context limit:
█ DefaultPrompter.makePrompt Ignored context due to context limit: user=0, corpus=0, previous=16:
█ telemetry-v2 recordEvent: cody.chat-question/executed:
█ ChatController streamAssistantResponse:
█ Autocomplete:onComplete duration:"9694ms" stopReason:"undefined" outputChannelId:"a5a3c3bb-a398-4d00-bcbb-886af3d2e7de":
█ telemetry-v2 recordEvent: cody.chatResponse/hasCode:
█ ClientConfigSingleton refreshing configuration:
█ ClientConfigSingleton refreshed: {"codyEnabled":true,"chatEnabled":true,"autoCompleteEnabled":true,"customCommandsEnabled":true,"attributionEnabled":false,"smartContextWindowEnabled":true,"modelsAPIEnabled":false,"latestSupportedCompletionsStreamAPIVersion":5}
█ telemetry-v2 recordEvent: cody.ghostText/visible:
█ Autocomplete:fastPathClient:fetch endpoint: "https://cody-gateway.sourcegraph.com/v1/completions/fireworks" outputChannelId: "8640ae81-2774-47b3-b8a3-4ca261f60b0b":
█ Autocomplete:requestManager Irrelevant request aborted:
█ telemetry-v2 recordEvent: cody.completion/noResponse:
█ Autocomplete:onComplete duration:"649ms" stopReason:"cody-request-aborted" outputChannelId:"8640ae81-2774-47b3-b8a3-4ca261f60b0b":
█ ClientConfigSingleton refreshing configuration:
█ ClientConfigSingleton refreshed: {"codyEnabled":true,"chatEnabled":true,"autoCompleteEnabled":true,"customCommandsEnabled":true,"attributionEnabled":false,"smartContextWindowEnabled":true,"modelsAPIEnabled":false,"latestSupportedCompletionsStreamAPIVersion":5}
█ UpstreamHealth Ping took 202ms (Gateway: 223ms):
█ telemetry-v2 recordEvent: cody.humanMessageEditor/submit:
█ telemetry-v2 recordEvent: cody.chat-question/submitted:
█ DefaultPrompter.makePrompt Ignored 1264 chat messages due to context limit:
█ DefaultPrompter.makePrompt Ignored context due to context limit: user=0, corpus=0, previous=16:
█ telemetry-v2 recordEvent: cody.chat-question/executed:
█ ChatController streamAssistantResponse:
█ Autocomplete:onComplete duration:"11747ms" stopReason:"undefined" outputChannelId:"e60ca5bc-1bf2-49ef-870e-e1dbb977e07b":
█ telemetry-v2 recordEvent: cody.chatResponse/hasCode:
█ ClientConfigSingleton refreshing configuration:
█ ClientConfigSingleton refreshed: {"codyEnabled":true,"chatEnabled":true,"autoCompleteEnabled":true,"customCommandsEnabled":true,"attributionEnabled":false,"smartContextWindowEnabled":true,"modelsAPIEnabled":false,"latestSupportedCompletionsStreamAPIVersion":5}
█ telemetry-v2 recordEvent: cody.humanMessageEditor/submit:
█ telemetry-v2 recordEvent: cody.chat-question/submitted:
█ DefaultPrompter.makePrompt Ignored 1264 chat messages due to context limit:
█ DefaultPrompter.makePrompt Ignored context due to context limit: user=0, corpus=0, previous=16:
█ telemetry-v2 recordEvent: cody.chat-question/executed:
█ ChatController streamAssistantResponse:
█ ClientConfigSingleton refreshing configuration:
█ ClientConfigSingleton refreshed: {"codyEnabled":true,"chatEnabled":true,"autoCompleteEnabled":true,"customCommandsEnabled":true,"attributionEnabled":false,"smartContextWindowEnabled":true,"modelsAPIEnabled":false,"latestSupportedCompletionsStreamAPIVersion":5}
█ Autocomplete:onComplete duration:"9792ms" stopReason:"undefined" outputChannelId:"7215a6f4-b19d-442e-b13d-413c49b1c9cd":
█ telemetry-v2 recordEvent: cody.chatResponse/hasCode:
█ telemetry-v2 recordEvent: cody.humanMessageEditor/submit:
█ telemetry-v2 recordEvent: cody.chat-question/submitted:
█ DefaultPrompter.makePrompt Ignored 1266 chat messages due to context limit:
█ DefaultPrompter.makePrompt Ignored context due to context limit: user=0, corpus=0, previous=16:
█ telemetry-v2 recordEvent: cody.chat-question/executed:
█ ChatController streamAssistantResponse:
█ ClientConfigSingleton refreshing configuration:
█ ClientConfigSingleton refreshed: {"codyEnabled":true,"chatEnabled":true,"autoCompleteEnabled":true,"customCommandsEnabled":true,"attributionEnabled":false,"smartContextWindowEnabled":true,"modelsAPIEnabled":false,"latestSupportedCompletionsStreamAPIVersion":5}
█ Autocomplete:onComplete duration:"12234ms" stopReason:"undefined" outputChannelId:"4b75de25-23f7-43b4-9a70-bd910cca2a40":
█ telemetry-v2 recordEvent: cody.chatResponse/hasCode:
█ ClientConfigSingleton refreshing configuration:
█ telemetry-v2 recordEvent: cody.humanMessageEditor/submit:
█ ClientConfigSingleton refreshed: {"codyEnabled":true,"chatEnabled":true,"autoCompleteEnabled":true,"customCommandsEnabled":true,"attributionEnabled":false,"smartContextWindowEnabled":true,"modelsAPIEnabled":false,"latestSupportedCompletionsStreamAPIVersion":5}
█ telemetry-v2 recordEvent: cody.chat-question/submitted:
█ DefaultPrompter.makePrompt Ignored 1266 chat messages due to context limit:
█ DefaultPrompter.makePrompt Ignored context due to context limit: user=0, corpus=0, previous=16:
█ telemetry-v2 recordEvent: cody.chat-question/executed:
█ ChatController streamAssistantResponse:
█ Autocomplete:onComplete duration:"15222ms" stopReason:"undefined" outputChannelId:"8806c6b2-b2c1-416b-b28d-db33e7e5832e":
█ telemetry-v2 recordEvent: cody.chatResponse/hasCode:
█ telemetry-v2 recordEvent: cody.copyButton/clicked:
█ telemetry-v2 recordEvent: cody.ghostText/visible:
█ telemetry-v2 recordEvent: cody.keyDown/paste:
█ ClientConfigSingleton refreshing configuration:
█ ClientConfigSingleton refreshed: {"codyEnabled":true,"chatEnabled":true,"autoCompleteEnabled":true,"customCommandsEnabled":true,"attributionEnabled":false,"smartContextWindowEnabled":true,"modelsAPIEnabled":false,"latestSupportedCompletionsStreamAPIVersion":5}
█ Autocomplete:fastPathClient:fetch endpoint: "https://cody-gateway.sourcegraph.com/v1/completions/fireworks" outputChannelId: "eb1f6357-952f-4aaf-9d14-061b21b69e55":
█ Autocomplete:requestManager Irrelevant request aborted:
█ telemetry-v2 recordEvent: cody.completion/noResponse:
█ Autocomplete:onComplete duration:"587ms" stopReason:"cody-request-aborted" outputChannelId:"eb1f6357-952f-4aaf-9d14-061b21b69e55":
█ Autocomplete:fastPathClient:fetch endpoint: "https://cody-gateway.sourcegraph.com/v1/completions/fireworks" outputChannelId: "c4646a96-c405-472b-b588-a2470f5ee753":
█ Autocomplete:onComplete duration:"965ms" stopReason:"cody-request-aborted" outputChannelId:"c4646a96-c405-472b-b588-a2470f5ee753":
█ ClientConfigSingleton refreshing configuration:
█ ClientConfigSingleton refreshed: {"codyEnabled":true,"chatEnabled":true,"autoCompleteEnabled":true,"customCommandsEnabled":true,"attributionEnabled":false,"smartContextWindowEnabled":true,"modelsAPIEnabled":false,"latestSupportedCompletionsStreamAPIVersion":5}
█ ClientConfigSingleton refreshing configuration:
█ ClientConfigSingleton refreshed: {"codyEnabled":true,"chatEnabled":true,"autoCompleteEnabled":true,"customCommandsEnabled":true,"attributionEnabled":false,"smartContextWindowEnabled":true,"modelsAPIEnabled":false,"latestSupportedCompletionsStreamAPIVersion":5}
█ ClientConfigSingleton refreshing configuration:
█ ClientConfigSingleton refreshed: {"codyEnabled":true,"chatEnabled":true,"autoCompleteEnabled":true,"customCommandsEnabled":true,"attributionEnabled":false,"smartContextWindowEnabled":true,"modelsAPIEnabled":false,"latestSupportedCompletionsStreamAPIVersion":5}
█ telemetry-v2 recordEvent: cody.humanMessageEditor/submit:
█ telemetry-v2 recordEvent: cody.chat-question/submitted:
█ DefaultPrompter.makePrompt Ignored 1266 chat messages due to context limit:
█ DefaultPrompter.makePrompt Ignored context due to context limit: user=0, corpus=0, previous=16:
█ telemetry-v2 recordEvent: cody.chat-question/executed:
█ ChatController streamAssistantResponse:
█ Autocomplete:onComplete duration:"10572ms" stopReason:"undefined" outputChannelId:"480d4e29-7f87-4d5c-86db-12443abd9a2c":
█ telemetry-v2 recordEvent: cody.chatResponse/hasCode:
█ telemetry-v2 recordEvent: cody.copyButton/clicked:
█ telemetry-v2 recordEvent: cody.ghostText/visible:
█ telemetry-v2 recordEvent: cody.keyDown/paste:
█ ClientConfigSingleton refreshing configuration:
█ ClientConfigSingleton refreshed: {"codyEnabled":true,"chatEnabled":true,"autoCompleteEnabled":true,"customCommandsEnabled":true,"attributionEnabled":false,"smartContextWindowEnabled":true,"modelsAPIEnabled":false,"latestSupportedCompletionsStreamAPIVersion":5}
█ telemetry-v2 recordEvent: cody.completion/suggested:
█ Autocomplete:fastPathClient:fetch endpoint: "https://cody-gateway.sourcegraph.com/v1/completions/fireworks" outputChannelId: "f0237e2b-2567-4b89-ac94-bdccae901238":
█ Autocomplete:fastPathClient:fetch endpoint: "https://cody-gateway.sourcegraph.com/v1/completions/fireworks" outputChannelId: "aae0ebb7-18cd-4f40-ae3e-550a8e1383c9":
█ telemetry-v2 recordEvent: cody.completion/synthesizedFromParallelRequest:
█ Autocomplete:onComplete duration:"628ms" stopReason:"stop" outputChannelId:"f0237e2b-2567-4b89-ac94-bdccae901238":
█ telemetry-v2 recordEvent: cody.completion/suggested:
█ Autocomplete:fastPathClient:fetch endpoint: "https://cody-gateway.sourcegraph.com/v1/completions/fireworks" outputChannelId: "4610992d-f569-46cc-8642-73f11d472f4a":
█ Autocomplete:fastPathClient:fetch endpoint: "https://cody-gateway.sourcegraph.com/v1/completions/fireworks" outputChannelId: "52b0e177-c66f-4c5d-a4df-acd1b279cdb6":
█ Autocomplete:fastPathClient:fetch endpoint: "https://cody-gateway.sourcegraph.com/v1/completions/fireworks" outputChannelId: "acb3297d-29db-4ef1-a424-16dc8c52a252":
█ Autocomplete:requestManager Irrelevant request aborted:
█ Autocomplete:onComplete duration:"1134ms" stopReason:"cody-request-aborted" outputChannelId:"52b0e177-c66f-4c5d-a4df-acd1b279cdb6":
█ Autocomplete:requestManager Irrelevant request aborted:
█ telemetry-v2 recordEvent: cody.completion/noResponse:
█ Autocomplete:onComplete duration:"597ms" stopReason:"stop" outputChannelId:"acb3297d-29db-4ef1-a424-16dc8c52a252":
█ ClientConfigSingleton refreshing configuration:
█ ClientConfigSingleton refreshed: {"codyEnabled":true,"chatEnabled":true,"autoCompleteEnabled":true,"customCommandsEnabled":true,"attributionEnabled":false,"smartContextWindowEnabled":true,"modelsAPIEnabled":false,"latestSupportedCompletionsStreamAPIVersion":5}
█ UpstreamHealth Ping took 183ms (Gateway: 205ms):
█ telemetry-v2 recordEvent: cody.ghostText/visible:
█ ClientConfigSingleton refreshing configuration:
█ ClientConfigSingleton refreshed: {"codyEnabled":true,"chatEnabled":true,"autoCompleteEnabled":true,"customCommandsEnabled":true,"attributionEnabled":false,"smartContextWindowEnabled":true,"modelsAPIEnabled":false,"latestSupportedCompletionsStreamAPIVersion":5}
█ telemetry-v2 recordEvent: cody.ghostText/visible:
█ ClientConfigSingleton refreshing configuration:
█ ClientConfigSingleton refreshed: {"codyEnabled":true,"chatEnabled":true,"autoCompleteEnabled":true,"customCommandsEnabled":true,"attributionEnabled":false,"smartContextWindowEnabled":true,"modelsAPIEnabled":false,"latestSupportedCompletionsStreamAPIVersion":5}
█ telemetry-v2 recordEvent: cody.humanMessageEditor/submit:
█ telemetry-v2 recordEvent: cody.chat-question/submitted:
█ DefaultPrompter.makePrompt Ignored 1266 chat messages due to context limit:
█ DefaultPrompter.makePrompt Ignored context due to context limit: user=0, corpus=0, previous=16:
█ telemetry-v2 recordEvent: cody.chat-question/executed:
█ ChatController streamAssistantResponse:
█ Autocomplete:onComplete duration:"8272ms" stopReason:"undefined" outputChannelId:"d7e2e0a0-8f62-43c4-9a29-13d80960c8e4":
█ telemetry-v2 recordEvent: cody.chatResponse/noCode:
█ Autocomplete:fastPathClient:fetch endpoint: "https://cody-gateway.sourcegraph.com/v1/completions/fireworks" outputChannelId: "df980238-3077-4337-9e62-7eaab7c3491b":
█ Autocomplete:fastPathClient:fetch endpoint: "https://cody-gateway.sourcegraph.com/v1/completions/fireworks" outputChannelId: "71ab9d03-a271-4055-916d-ed51bd2145f1":
█ Autocomplete:fastPathClient:fetch endpoint: "https://cody-gateway.sourcegraph.com/v1/completions/fireworks" outputChannelId: "bd9e83cf-5bff-40b5-b533-a18dc0ca751f":
█ telemetry-v2 recordEvent: cody.completion/synthesizedFromParallelRequest:
█ Autocomplete:onComplete duration:"1139ms" stopReason:"cody-streaming-chunk" outputChannelId:"71ab9d03-a271-4055-916d-ed51bd2145f1":
█ ClientConfigSingleton refreshing configuration:
█ ClientConfigSingleton refreshed: {"codyEnabled":true,"chatEnabled":true,"autoCompleteEnabled":true,"customCommandsEnabled":true,"attributionEnabled":false,"smartContextWindowEnabled":true,"modelsAPIEnabled":false,"latestSupportedCompletionsStreamAPIVersion":5}
█ telemetry-v2 recordEvent: cody.humanMessageEditor/submit:
█ telemetry-v2 recordEvent: cody.chat-question/submitted:
█ DefaultPrompter.makePrompt Ignored 1266 chat messages due to context limit:
█ DefaultPrompter.makePrompt Ignored context due to context limit: user=0, corpus=0, previous=16:
█ telemetry-v2 recordEvent: cody.chat-question/executed:
█ ChatController streamAssistantResponse:
█ ClientConfigSingleton refreshing configuration:
█ ClientConfigSingleton refreshed: {"codyEnabled":true,"chatEnabled":true,"autoCompleteEnabled":true,"customCommandsEnabled":true,"attributionEnabled":false,"smartContextWindowEnabled":true,"modelsAPIEnabled":false,"latestSupportedCompletionsStreamAPIVersion":5}
█ Autocomplete:onComplete duration:"10031ms" stopReason:"undefined" outputChannelId:"69c87b60-ec0b-478b-8b35-cbccc7a9e785":
█ telemetry-v2 recordEvent: cody.chatResponse/hasCode:
█ telemetry-v2 recordEvent: cody.copyButton/clicked:
█ telemetry-v2 recordEvent: cody.ghostText/visible:
█ telemetry-v2 recordEvent: cody.keyDown/paste:
█ telemetry-v2 recordEvent: cody.humanMessageEditor/submit:
█ telemetry-v2 recordEvent: cody.chat-question/submitted:
█ DefaultPrompter.makePrompt Ignored 1270 chat messages due to context limit:
█ DefaultPrompter.makePrompt Ignored context due to context limit: user=0, corpus=0, previous=16:
█ telemetry-v2 recordEvent: cody.chat-question/executed:
█ ChatController streamAssistantResponse:
█ ClientConfigSingleton refreshing configuration:
█ ClientConfigSingleton refreshed: {"codyEnabled":true,"chatEnabled":true,"autoCompleteEnabled":true,"customCommandsEnabled":true,"attributionEnabled":false,"smartContextWindowEnabled":true,"modelsAPIEnabled":false,"latestSupportedCompletionsStreamAPIVersion":5}
█ Autocomplete:onComplete duration:"10159ms" stopReason:"undefined" outputChannelId:"80ea8505-f05c-4f5d-be7a-a901ac2f7ba0":
█ telemetry-v2 recordEvent: cody.chatResponse/hasCode:
█ telemetry-v2 recordEvent: cody.copyButton/clicked:
█ telemetry-v2 recordEvent: cody.ghostText/visible:
█ telemetry-v2 recordEvent: cody.keyDown/paste:
█ telemetry-v2 recordEvent: cody.completion/suggested:
█ Autocomplete:fastPathClient:fetch endpoint: "https://cody-gateway.sourcegraph.com/v1/completions/fireworks" outputChannelId: "45fa7229-5261-440c-8cb0-25a940fbc456":
█ Autocomplete:onComplete duration:"632ms" stopReason:"cody-request-aborted" outputChannelId:"45fa7229-5261-440c-8cb0-25a940fbc456":
█ Autocomplete:fastPathClient:fetch endpoint: "https://cody-gateway.sourcegraph.com/v1/completions/fireworks" outputChannelId: "5004c816-faa8-4b69-97a4-fadfe90aae56":
█ Autocomplete:requestManager Irrelevant request aborted:
█ telemetry-v2 recordEvent: cody.completion/noResponse:
█ telemetry-v2 recordEvent: cody.completion/noResponse:
█ Autocomplete:onComplete duration:"623ms" stopReason:"stop" outputChannelId:"5004c816-faa8-4b69-97a4-fadfe90aae56":
█ ClientConfigSingleton refreshing configuration:
█ ClientConfigSingleton refreshed: {"codyEnabled":true,"chatEnabled":true,"autoCompleteEnabled":true,"customCommandsEnabled":true,"attributionEnabled":false,"smartContextWindowEnabled":true,"modelsAPIEnabled":false,"latestSupportedCompletionsStreamAPIVersion":5}
█ telemetry-v2 recordEvent: cody.completion.stageCounter/flush:
█ telemetry-v2 recordEvent: cody.characters/flush:
█ telemetry-v2 recordEvent: cody.humanMessageEditor/submit:
█ telemetry-v2 recordEvent: cody.chat-question/submitted:
█ DefaultPrompter.makePrompt Ignored 1274 chat messages due to context limit:
█ DefaultPrompter.makePrompt Ignored context due to context limit: user=0, corpus=0, previous=16:
█ telemetry-v2 recordEvent: cody.chat-question/executed:
█ ChatController streamAssistantResponse:
█ ClientConfigSingleton refreshing configuration:
█ ClientConfigSingleton refreshed: {"codyEnabled":true,"chatEnabled":true,"autoCompleteEnabled":true,"customCommandsEnabled":true,"attributionEnabled":false,"smartContextWindowEnabled":true,"modelsAPIEnabled":false,"latestSupportedCompletionsStreamAPIVersion":5}
█ Autocomplete:onComplete duration:"9694ms" stopReason:"undefined" outputChannelId:"312155d2-6370-4d0a-9d38-e6370eaf03ea":
█ telemetry-v2 recordEvent: cody.chatResponse/hasCode:
█ UpstreamHealth Ping took 149ms (Gateway: 225ms):
█ telemetry-v2 recordEvent: cody.copyButton/clicked:
█ telemetry-v2 recordEvent: cody.ghostText/visible:
█ telemetry-v2 recordEvent: cody.keyDown/paste:
█ ClientConfigSingleton refreshing configuration:
█ ClientConfigSingleton refreshed: {"codyEnabled":true,"chatEnabled":true,"autoCompleteEnabled":true,"customCommandsEnabled":true,"attributionEnabled":false,"smartContextWindowEnabled":true,"modelsAPIEnabled":false,"latestSupportedCompletionsStreamAPIVersion":5}
█ telemetry-v2 recordEvent: cody.humanMessageEditor/submit:
█ telemetry-v2 recordEvent: cody.chat-question/submitted:
█ DefaultPrompter.makePrompt Ignored 1274 chat messages due to context limit:
█ DefaultPrompter.makePrompt Ignored context due to context limit: user=0, corpus=0, previous=16:
█ telemetry-v2 recordEvent: cody.chat-question/executed:
█ ChatController streamAssistantResponse:
█ Autocomplete:fastPathClient:fetch endpoint: "https://cody-gateway.sourcegraph.com/v1/completions/fireworks" outputChannelId: "e870ba7b-ac28-4c73-9881-5bfadd8d5849":
█ Autocomplete:onComplete duration:"11764ms" stopReason:"undefined" outputChannelId:"5c38fa8b-f7c9-41ba-8af5-3b210fd6c821":
█ telemetry-v2 recordEvent: cody.chatResponse/hasCode:
█ Autocomplete:onComplete duration:"1241ms" stopReason:"stop" outputChannelId:"e870ba7b-ac28-4c73-9881-5bfadd8d5849":
█ telemetry-v2 recordEvent: cody.copyButton/clicked:
█ ClientConfigSingleton refreshing configuration:
█ ClientConfigSingleton refreshed: {"codyEnabled":true,"chatEnabled":true,"autoCompleteEnabled":true,"customCommandsEnabled":true,"attributionEnabled":false,"smartContextWindowEnabled":true,"modelsAPIEnabled":false,"latestSupportedCompletionsStreamAPIVersion":5}
█ ClientConfigSingleton refreshing configuration:
█ ClientConfigSingleton refreshed: {"codyEnabled":true,"chatEnabled":true,"autoCompleteEnabled":true,"customCommandsEnabled":true,"attributionEnabled":false,"smartContextWindowEnabled":true,"modelsAPIEnabled":false,"latestSupportedCompletionsStreamAPIVersion":5}
█ telemetry-v2 recordEvent: cody.humanMessageEditor.toolbar.mention/click:
█ telemetry-v2 recordEvent: cody.at-mention/selected:
█ telemetry-v2 recordEvent: cody.at-mention/selected:
█ telemetry-v2 recordEvent: cody.humanMessageEditor/submit:
█ telemetry-v2 recordEvent: cody.chat-question/submitted:
█ DefaultPrompter.makePrompt Ignored 1278 chat messages due to context limit:
█ DefaultPrompter.makePrompt Ignored context due to context limit: user=0, corpus=0, previous=10:
█ telemetry-v2 recordEvent: cody.chat-question/executed:
█ ChatController streamAssistantResponse:
█ ClientConfigSingleton refreshing configuration:
█ ClientConfigSingleton refreshed: {"codyEnabled":true,"chatEnabled":true,"autoCompleteEnabled":true,"customCommandsEnabled":true,"attributionEnabled":false,"smartContextWindowEnabled":true,"modelsAPIEnabled":false,"latestSupportedCompletionsStreamAPIVersion":5}
█ Autocomplete:onComplete duration:"12002ms" stopReason:"undefined" outputChannelId:"fbb7703d-0656-4ecb-b9ba-7b38748c115c":
█ telemetry-v2 recordEvent: cody.chatResponse/hasCode:
█ telemetry-v2 recordEvent: cody.copyButton/clicked:
█ telemetry-v2 recordEvent: cody.ghostText/visible:
█ telemetry-v2 recordEvent: cody.keyDown/paste:
█ ClientConfigSingleton refreshing configuration:
█ ClientConfigSingleton refreshed: {"codyEnabled":true,"chatEnabled":true,"autoCompleteEnabled":true,"customCommandsEnabled":true,"attributionEnabled":false,"smartContextWindowEnabled":true,"modelsAPIEnabled":false,"latestSupportedCompletionsStreamAPIVersion":5}
█ ClientConfigSingleton refreshing configuration:
█ ClientConfigSingleton refreshed: {"codyEnabled":true,"chatEnabled":true,"autoCompleteEnabled":true,"customCommandsEnabled":true,"attributionEnabled":false,"smartContextWindowEnabled":true,"modelsAPIEnabled":false,"latestSupportedCompletionsStreamAPIVersion":5}
█ telemetry-v2 recordEvent: cody.humanMessageEditor/submit:
█ telemetry-v2 recordEvent: cody.chat-question/submitted:
█ DefaultPrompter.makePrompt Ignored 1280 chat messages due to context limit:
█ DefaultPrompter.makePrompt Ignored context due to context limit: user=0, corpus=0, previous=17:
█ telemetry-v2 recordEvent: cody.chat-question/executed:
█ ChatController streamAssistantResponse:
█ Autocomplete:onComplete duration:"12748ms" stopReason:"undefined" outputChannelId:"389e0fe3-15f4-4b5a-8b6b-e923ea500beb":
█ telemetry-v2 recordEvent: cody.chatResponse/hasCode:
█ telemetry-v2 recordEvent: cody.copyButton/clicked:
█ telemetry-v2 recordEvent: cody.ghostText/visible:
█ telemetry-v2 recordEvent: cody.keyDown/paste:
█ ClientConfigSingleton refreshing configuration:
█ ClientConfigSingleton refreshed: {"codyEnabled":true,"chatEnabled":true,"autoCompleteEnabled":true,"customCommandsEnabled":true,"attributionEnabled":false,"smartContextWindowEnabled":true,"modelsAPIEnabled":false,"latestSupportedCompletionsStreamAPIVersion":5}
█ telemetry-v2 recordEvent: cody.humanMessageEditor/submit:
█ telemetry-v2 recordEvent: cody.chat-question/submitted:
█ DefaultPrompter.makePrompt Ignored 1280 chat messages due to context limit:
█ DefaultPrompter.makePrompt Ignored context due to context limit: user=0, corpus=0, previous=17:
█ telemetry-v2 recordEvent: cody.chat-question/executed:
█ ChatController streamAssistantResponse:
█ Autocomplete:onComplete duration:"9618ms" stopReason:"undefined" outputChannelId:"a6ce3162-fc54-433a-9065-a0a693cc2fbe":
█ telemetry-v2 recordEvent: cody.chatResponse/hasCode:
█ ClientConfigSingleton refreshing configuration:
█ ClientConfigSingleton refreshed: {"codyEnabled":true,"chatEnabled":true,"autoCompleteEnabled":true,"customCommandsEnabled":true,"attributionEnabled":false,"smartContextWindowEnabled":true,"modelsAPIEnabled":false,"latestSupportedCompletionsStreamAPIVersion":5}
█ telemetry-v2 recordEvent: cody.copyButton/clicked:
█ telemetry-v2 recordEvent: cody.ghostText/visible:
█ telemetry-v2 recordEvent: cody.keyDown/paste:
█ telemetry-v2 recordEvent: cody.copyButton/clicked:
█ ClientConfigSingleton refreshing configuration:
█ ClientConfigSingleton refreshed: {"codyEnabled":true,"chatEnabled":true,"autoCompleteEnabled":true,"customCommandsEnabled":true,"attributionEnabled":false,"smartContextWindowEnabled":true,"modelsAPIEnabled":false,"latestSupportedCompletionsStreamAPIVersion":5}
█ UpstreamHealth Ping took 303ms (Gateway: 249ms):
█ telemetry-v2 recordEvent: cody.humanMessageEditor/submit:
█ telemetry-v2 recordEvent: cody.chat-question/submitted:
█ DefaultPrompter.makePrompt Ignored 1280 chat messages due to context limit:
█ DefaultPrompter.makePrompt Ignored context due to context limit: user=0, corpus=0, previous=17:
█ telemetry-v2 recordEvent: cody.chat-question/executed:
█ ChatController streamAssistantResponse:
█ Autocomplete:onComplete duration:"8147ms" stopReason:"undefined" outputChannelId:"49b16f1c-bf68-4b29-a6fe-c56b87089a86":
█ telemetry-v2 recordEvent: cody.chatResponse/hasCode:
█ ClientConfigSingleton refreshing configuration:
█ ClientConfigSingleton refreshed: {"codyEnabled":true,"chatEnabled":true,"autoCompleteEnabled":true,"customCommandsEnabled":true,"attributionEnabled":false,"smartContextWindowEnabled":true,"modelsAPIEnabled":false,"latestSupportedCompletionsStreamAPIVersion":5}
█ telemetry-v2 recordEvent: cody.copyButton/clicked:
█ ClientConfigSingleton refreshing configuration:
█ ClientConfigSingleton refreshed: {"codyEnabled":true,"chatEnabled":true,"autoCompleteEnabled":true,"customCommandsEnabled":true,"attributionEnabled":false,"smartContextWindowEnabled":true,"modelsAPIEnabled":false,"latestSupportedCompletionsStreamAPIVersion":5}
█ ClientConfigSingleton refreshing configuration:
█ ClientConfigSingleton refreshed: {"codyEnabled":true,"chatEnabled":true,"autoCompleteEnabled":true,"customCommandsEnabled":true,"attributionEnabled":false,"smartContextWindowEnabled":true,"modelsAPIEnabled":false,"latestSupportedCompletionsStreamAPIVersion":5}
█ telemetry-v2 recordEvent: cody.humanMessageEditor/submit:
█ telemetry-v2 recordEvent: cody.chat-question/submitted:
█ DefaultPrompter.makePrompt Ignored 1280 chat messages due to context limit:
█ DefaultPrompter.makePrompt Ignored context due to context limit: user=0, corpus=0, previous=17:
█ telemetry-v2 recordEvent: cody.chat-question/executed:
█ ChatController streamAssistantResponse:
█ Autocomplete:onComplete duration:"9333ms" stopReason:"undefined" outputChannelId:"da251ecc-090d-4a8d-9ea9-f24d22805ffc":
█ telemetry-v2 recordEvent: cody.chatResponse/noCode:
█ ClientConfigSingleton refreshing configuration:
█ ClientConfigSingleton refreshed: {"codyEnabled":true,"chatEnabled":true,"autoCompleteEnabled":true,"customCommandsEnabled":true,"attributionEnabled":false,"smartContextWindowEnabled":true,"modelsAPIEnabled":false,"latestSupportedCompletionsStreamAPIVersion":5}
█ telemetry-v2 recordEvent: cody.humanMessageEditor/submit:
█ telemetry-v2 recordEvent: cody.chat-question/submitted:
█ DefaultPrompter.makePrompt Ignored 1282 chat messages due to context limit:
█ DefaultPrompter.makePrompt Ignored context due to context limit: user=0, corpus=0, previous=17:
█ telemetry-v2 recordEvent: cody.chat-question/executed:
█ ChatController streamAssistantResponse:
█ ClientConfigSingleton refreshing configuration:
█ ClientConfigSingleton refreshed: {"codyEnabled":true,"chatEnabled":true,"autoCompleteEnabled":true,"customCommandsEnabled":true,"attributionEnabled":false,"smartContextWindowEnabled":true,"modelsAPIEnabled":false,"latestSupportedCompletionsStreamAPIVersion":5}
█ Autocomplete:onComplete duration:"8573ms" stopReason:"undefined" outputChannelId:"f4f87921-a870-4d8f-a32d-72ff46b30cc3":
█ telemetry-v2 recordEvent: cody.chatResponse/hasCode:
█ telemetry-v2 recordEvent: cody.humanMessageEditor/submit:
█ telemetry-v2 recordEvent: cody.chat-question/submitted:
█ DefaultPrompter.makePrompt Ignored 1284 chat messages due to context limit:
█ DefaultPrompter.makePrompt Ignored context due to context limit: user=0, corpus=0, previous=17:
█ telemetry-v2 recordEvent: cody.chat-question/executed:
█ ChatController streamAssistantResponse:
█ ClientConfigSingleton refreshing configuration:
█ ClientConfigSingleton refreshed: {"codyEnabled":true,"chatEnabled":true,"autoCompleteEnabled":true,"customCommandsEnabled":true,"attributionEnabled":false,"smartContextWindowEnabled":true,"modelsAPIEnabled":false,"latestSupportedCompletionsStreamAPIVersion":5}
█ Autocomplete:onComplete duration:"25577ms" stopReason:"undefined" outputChannelId:"06971bc6-fd0e-41cb-99b5-6a7f295a6068":
█ telemetry-v2 recordEvent: cody.chatResponse/hasCode:
█ ClientConfigSingleton refreshing configuration:
█ ClientConfigSingleton refreshed: {"codyEnabled":true,"chatEnabled":true,"autoCompleteEnabled":true,"customCommandsEnabled":true,"attributionEnabled":false,"smartContextWindowEnabled":true,"modelsAPIEnabled":false,"latestSupportedCompletionsStreamAPIVersion":5}
█ telemetry-v2 recordEvent: cody.completion/suggested:
█ Autocomplete:fastPathClient:fetch endpoint: "https://cody-gateway.sourcegraph.com/v1/completions/fireworks" outputChannelId: "059c037c-e4d2-4eb1-87a3-5b058d8f63b0":
█ Autocomplete:onComplete duration:"1024ms" stopReason:"cody-request-aborted" outputChannelId:"059c037c-e4d2-4eb1-87a3-5b058d8f63b0":
█ ClientConfigSingleton refreshing configuration:
█ telemetry-v2 recordEvent: cody.ghostText/visible:
█ ClientConfigSingleton refreshed: {"codyEnabled":true,"chatEnabled":true,"autoCompleteEnabled":true,"customCommandsEnabled":true,"attributionEnabled":false,"smartContextWindowEnabled":true,"modelsAPIEnabled":false,"latestSupportedCompletionsStreamAPIVersion":5}
█ telemetry-v2 recordEvent: cody.completion/suggested:
█ Autocomplete:fastPathClient:fetch endpoint: "https://cody-gateway.sourcegraph.com/v1/completions/fireworks" outputChannelId: "0cdd4f17-b19c-47b7-aab7-a512e3d18e94":
█ Autocomplete:requestManager Irrelevant request aborted:
█ telemetry-v2 recordEvent: cody.completion/noResponse:
█ Autocomplete:onComplete duration:"689ms" stopReason:"cody-request-aborted" outputChannelId:"0cdd4f17-b19c-47b7-aab7-a512e3d18e94":
█ telemetry-v2 recordEvent: cody.ghostText/visible:
█ ClientConfigSingleton refreshing configuration:
█ ClientConfigSingleton refreshed: {"codyEnabled":true,"chatEnabled":true,"autoCompleteEnabled":true,"customCommandsEnabled":true,"attributionEnabled":false,"smartContextWindowEnabled":true,"modelsAPIEnabled":false,"latestSupportedCompletionsStreamAPIVersion":5}
█ telemetry-v2 recordEvent: cody.ghostText/visible:
█ Autocomplete:fastPathClient:fetch endpoint: "https://cody-gateway.sourcegraph.com/v1/completions/fireworks" outputChannelId: "6e1d062d-86f8-4150-b92f-55233260e394":
█ Autocomplete:onComplete duration:"689ms" stopReason:"stop" outputChannelId:"6e1d062d-86f8-4150-b92f-55233260e394":
█ telemetry-v2 recordEvent: cody.completion/noResponse:
█ telemetry-v2 recordEvent: cody.completion/suggested:
█ Autocomplete:fastPathClient:fetch endpoint: "https://cody-gateway.sourcegraph.com/v1/completions/fireworks" outputChannelId: "f8a4916b-fc06-479a-9e9d-2353dd4d9afc":
█ Autocomplete:onComplete duration:"608ms" stopReason:"cody-request-aborted" outputChannelId:"f8a4916b-fc06-479a-9e9d-2353dd4d9afc":
█ telemetry-v2 recordEvent: cody.completion/suggested:
█ Autocomplete:fastPathClient:fetch endpoint: "https://cody-gateway.sourcegraph.com/v1/completions/fireworks" outputChannelId: "49d81728-6d26-4e59-9667-a9a632240cfa":
█ Autocomplete:onComplete duration:"603ms" stopReason:"stop" outputChannelId:"49d81728-6d26-4e59-9667-a9a632240cfa":
█ Autocomplete:fastPathClient:fetch endpoint: "https://cody-gateway.sourcegraph.com/v1/completions/fireworks" outputChannelId: "1d5de760-937a-4f90-8513-1f772883bee6":
█ telemetry-v2 recordEvent: cody.completion/noResponse:
█ telemetry-v2 recordEvent: cody.completion/noResponse:
█ Autocomplete:onComplete duration:"945ms" stopReason:"cody-streaming-chunk" outputChannelId:"1d5de760-937a-4f90-8513-1f772883bee6":
█ Autocomplete:fastPathClient:fetch endpoint: "https://cody-gateway.sourcegraph.com/v1/completions/fireworks" outputChannelId: "090cb8fa-d4a1-44bc-99b7-14ff120e5308":
█ telemetry-v2 recordEvent: cody.completion/noResponse:
█ Autocomplete:onComplete duration:"624ms" stopReason:"stop" outputChannelId:"090cb8fa-d4a1-44bc-99b7-14ff120e5308":
█ Autocomplete:fastPathClient:fetch endpoint: "https://cody-gateway.sourcegraph.com/v1/completions/fireworks" outputChannelId: "ba325196-d0b7-48de-9dd4-af41318a233b":
█ Autocomplete:fastPathClient:fetch endpoint: "https://cody-gateway.sourcegraph.com/v1/completions/fireworks" outputChannelId: "7b7e8868-0a5f-4791-ad2e-e7ea781f8458":
█ Autocomplete:onComplete duration:"558ms" stopReason:"stop" outputChannelId:"7b7e8868-0a5f-4791-ad2e-e7ea781f8458":
█ telemetry-v2 recordEvent: cody.completion/suggested:
█ telemetry-v2 recordEvent: cody.completion/accepted:
█ telemetry-v2 recordEvent: cody.completion/noResponse:
█ telemetry-v2 recordEvent: cody.completion/suggested:
█ Autocomplete:fastPathClient:fetch endpoint: "https://cody-gateway.sourcegraph.com/v1/completions/fireworks" outputChannelId: "382b9b27-3cd5-4007-b4b2-1dade98d8b45":
█ Autocomplete:fastPathClient:fetch endpoint: "https://cody-gateway.sourcegraph.com/v1/completions/fireworks" outputChannelId: "d383c6bb-f3d0-43c3-b111-c2516e50072e":
█ telemetry-v2 recordEvent: cody.completion/synthesizedFromParallelRequest:
█ Autocomplete:onComplete duration:"1272ms" stopReason:"stop" outputChannelId:"382b9b27-3cd5-4007-b4b2-1dade98d8b45":
█ Autocomplete:fastPathClient:fetch endpoint: "https://cody-gateway.sourcegraph.com/v1/completions/fireworks" outputChannelId: "9e7d3ef2-d7a4-4c0e-8dd0-7a12ab55ebfc":
█ Autocomplete:onComplete duration:"626ms" stopReason:"stop" outputChannelId:"9e7d3ef2-d7a4-4c0e-8dd0-7a12ab55ebfc":
█ Autocomplete:fastPathClient:fetch endpoint: "https://cody-gateway.sourcegraph.com/v1/completions/fireworks" outputChannelId: "e8616831-a68c-450c-939e-03ab6d61ee25":
█ Autocomplete:fastPathClient:fetch endpoint: "https://cody-gateway.sourcegraph.com/v1/completions/fireworks" outputChannelId: "20cda2a0-e937-4a2a-bfe2-aed60476cb83":
█ Autocomplete:fastPathClient:fetch endpoint: "https://cody-gateway.sourcegraph.com/v1/completions/fireworks" outputChannelId: "e78727bb-e3cb-49fe-a44d-899924806999":
█ Autocomplete:fastPathClient:fetch endpoint: "https://cody-gateway.sourcegraph.com/v1/completions/fireworks" outputChannelId: "8175632a-9535-440d-8e30-a2144952d238":
█ telemetry-v2 recordEvent: cody.completion/synthesizedFromParallelRequest:
█ Autocomplete:onComplete duration:"1042ms" stopReason:"stop" outputChannelId:"20cda2a0-e937-4a2a-bfe2-aed60476cb83":
█ ClientConfigSingleton refreshing configuration:
█ Autocomplete:fastPathClient:fetch endpoint: "https://cody-gateway.sourcegraph.com/v1/completions/fireworks" outputChannelId: "9ab679a1-79f9-4e44-8c75-7544bb658a3d":
█ Autocomplete:fastPathClient:fetch endpoint: "https://cody-gateway.sourcegraph.com/v1/completions/fireworks" outputChannelId: "391dd1c0-bd34-4809-9cda-28c049725268":
█ Autocomplete:fastPathClient:fetch endpoint: "https://cody-gateway.sourcegraph.com/v1/completions/fireworks" outputChannelId: "a96bf984-65b7-41a7-a871-f51eab6c5b5a":
█ ClientConfigSingleton refreshed: {"codyEnabled":true,"chatEnabled":true,"autoCompleteEnabled":true,"customCommandsEnabled":true,"attributionEnabled":false,"smartContextWindowEnabled":true,"modelsAPIEnabled":false,"latestSupportedCompletionsStreamAPIVersion":5}
█ Autocomplete:onComplete duration:"295ms" stopReason:"stop" outputChannelId:"a96bf984-65b7-41a7-a871-f51eab6c5b5a":
█ UpstreamHealth Ping took 184ms (Gateway: 203ms):
█ telemetry-v2 recordEvent: cody.completion.persistence/present:
█ telemetry-v2 recordEvent: cody.ghostText/visible:
█ telemetry-v2 recordEvent: cody.completion/suggested:
█ Autocomplete:fastPathClient:fetch endpoint: "https://cody-gateway.sourcegraph.com/v1/completions/fireworks" outputChannelId: "7f0a47b2-ab7e-4dd3-8c72-badd80f7faca":
█ Autocomplete:onComplete duration:"587ms" stopReason:"stop" outputChannelId:"7f0a47b2-ab7e-4dd3-8c72-badd80f7faca":
█ telemetry-v2 recordEvent: cody.completion/suggested:
█ Autocomplete:fastPathClient:fetch endpoint: "https://cody-gateway.sourcegraph.com/v1/completions/fireworks" outputChannelId: "60ecf6d3-8111-41cc-9a9b-ff93e7e77103":
█ telemetry-v2 recordEvent: cody.completion/noResponse:
█ Autocomplete:onComplete duration:"576ms" stopReason:"stop" outputChannelId:"60ecf6d3-8111-41cc-9a9b-ff93e7e77103":
█ Autocomplete:fastPathClient:fetch endpoint: "https://cody-gateway.sourcegraph.com/v1/completions/fireworks" outputChannelId: "90f7cca2-e04d-4cd3-8364-79cf95bcf412":
█ telemetry-v2 recordEvent: cody.completion/noResponse:
█ Autocomplete:onComplete duration:"792ms" stopReason:"cody-request-aborted" outputChannelId:"90f7cca2-e04d-4cd3-8364-79cf95bcf412":
█ telemetry-v2 recordEvent: cody.copyButton/clicked:
█ telemetry-v2 recordEvent: cody.ghostText/visible:
█ telemetry-v2 recordEvent: cody.copyButton/clicked:
█ telemetry-v2 recordEvent: cody.copyButton/clicked:
█ telemetry-v2 recordEvent: cody.copyButton/clicked:
█ telemetry-v2 recordEvent: cody.copyButton/clicked:
█ telemetry-v2 recordEvent: cody.ghostText/visible:
█ telemetry-v2 recordEvent: cody.completion.persistence/removed:
█ telemetry-v2 recordEvent: cody.keyDown/paste:
█ ClientConfigSingleton refreshing configuration:
█ ClientConfigSingleton refreshed: {"codyEnabled":true,"chatEnabled":true,"autoCompleteEnabled":true,"customCommandsEnabled":true,"attributionEnabled":false,"smartContextWindowEnabled":true,"modelsAPIEnabled":false,"latestSupportedCompletionsStreamAPIVersion":5}
█ telemetry-v2 recordEvent: cody.ghostText/visible:
█ Autocomplete:fastPathClient:fetch endpoint: "https://cody-gateway.sourcegraph.com/v1/completions/fireworks" outputChannelId: "c43320f1-892d-49bd-a468-d2cf31148930":
█ Autocomplete:fastPathClient:fetch endpoint: "https://cody-gateway.sourcegraph.com/v1/completions/fireworks" outputChannelId: "8865b219-8b94-4823-88c3-df01e5d88718":
█ Autocomplete:fastPathClient:fetch endpoint: "https://cody-gateway.sourcegraph.com/v1/completions/fireworks" outputChannelId: "a6229475-57d6-411f-bd62-b4f2e556c738":
█ Autocomplete:fastPathClient:fetch endpoint: "https://cody-gateway.sourcegraph.com/v1/completions/fireworks" outputChannelId: "bd930814-288c-47e6-b582-311a21ac8d19":
█ Autocomplete:fastPathClient:fetch endpoint: "https://cody-gateway.sourcegraph.com/v1/completions/fireworks" outputChannelId: "cf6f6606-9808-4c9e-987b-32cc852ff125":
█ Autocomplete:fastPathClient:fetch endpoint: "https://cody-gateway.sourcegraph.com/v1/completions/fireworks" outputChannelId: "0f4f3e31-4ef5-4ecd-b9b5-d669026a7787":
█ Autocomplete:requestManager Irrelevant request aborted:
█ Autocomplete:onComplete duration:"1266ms" stopReason:"cody-request-aborted" outputChannelId:"8865b219-8b94-4823-88c3-df01e5d88718":
█ Autocomplete:fastPathClient:fetch endpoint: "https://cody-gateway.sourcegraph.com/v1/completions/fireworks" outputChannelId: "25b09573-76c3-4983-abd0-a309a8fb0ec7":
█ Autocomplete:onComplete duration:"1057ms" stopReason:"stop" outputChannelId:"cf6f6606-9808-4c9e-987b-32cc852ff125":
█ Autocomplete:fastPathClient:fetch endpoint: "https://cody-gateway.sourcegraph.com/v1/completions/fireworks" outputChannelId: "5d38daf7-3d15-490d-a8da-278a464493d0":
█ telemetry-v2 recordEvent: cody.completion/synthesizedFromParallelRequest:
█ Autocomplete:fastPathClient:fetch endpoint: "https://cody-gateway.sourcegraph.com/v1/completions/fireworks" outputChannelId: "2fd9a1df-727e-406b-9520-9cc3edaff370":
█ Autocomplete:requestManager Irrelevant request aborted:
█ Autocomplete:onComplete duration:"916ms" stopReason:"stop" outputChannelId:"25b09573-76c3-4983-abd0-a309a8fb0ec7":
█ Autocomplete:fastPathClient:fetch endpoint: "https://cody-gateway.sourcegraph.com/v1/completions/fireworks" outputChannelId: "f383292c-556e-4b07-80aa-0afa067ba321":
█ telemetry-v2 recordEvent: cody.completion/synthesizedFromParallelRequest:
█ Autocomplete:onComplete duration:"1305ms" stopReason:"stop" outputChannelId:"2fd9a1df-727e-406b-9520-9cc3edaff370":
█ ClientConfigSingleton refreshing configuration:
█ ClientConfigSingleton refreshed: {"codyEnabled":true,"chatEnabled":true,"autoCompleteEnabled":true,"customCommandsEnabled":true,"attributionEnabled":false,"smartContextWindowEnabled":true,"modelsAPIEnabled":false,"latestSupportedCompletionsStreamAPIVersion":5}
█ ClientConfigSingleton refreshing configuration:
█ ClientConfigSingleton refreshed: {"codyEnabled":true,"chatEnabled":true,"autoCompleteEnabled":true,"customCommandsEnabled":true,"attributionEnabled":false,"smartContextWindowEnabled":true,"modelsAPIEnabled":false,"latestSupportedCompletionsStreamAPIVersion":5}
█ telemetry-v2 recordEvent: cody.humanMessageEditor/submit:
█ telemetry-v2 recordEvent: cody.chat-question/submitted:
█ DefaultPrompter.makePrompt Ignored 1286 chat messages due to context limit:
█ DefaultPrompter.makePrompt Ignored context due to context limit: user=0, corpus=0, previous=17:
█ telemetry-v2 recordEvent: cody.chat-question/executed:
█ ChatController streamAssistantResponse:
█ Autocomplete:onComplete duration:"7633ms" stopReason:"undefined" outputChannelId:"af8fa54d-b89d-441d-a520-9dfe41558da1":
█ telemetry-v2 recordEvent: cody.chatResponse/noCode:
█ ClientConfigSingleton refreshing configuration:
█ ClientConfigSingleton refreshed: {"codyEnabled":true,"chatEnabled":true,"autoCompleteEnabled":true,"customCommandsEnabled":true,"attributionEnabled":false,"smartContextWindowEnabled":true,"modelsAPIEnabled":false,"latestSupportedCompletionsStreamAPIVersion":5}
█ ClientConfigSingleton refreshing configuration:
█ ClientConfigSingleton refreshed: {"codyEnabled":true,"chatEnabled":true,"autoCompleteEnabled":true,"customCommandsEnabled":true,"attributionEnabled":false,"smartContextWindowEnabled":true,"modelsAPIEnabled":false,"latestSupportedCompletionsStreamAPIVersion":5}
█ telemetry-v2 recordEvent: cody.humanMessageEditor/submit:
█ telemetry-v2 recordEvent: cody.chat-question/submitted:
█ DefaultPrompter.makePrompt Ignored 1288 chat messages due to context limit:
█ DefaultPrompter.makePrompt Ignored context due to context limit: user=0, corpus=0, previous=17:
█ telemetry-v2 recordEvent: cody.chat-question/executed:
█ ChatController streamAssistantResponse:
█ Autocomplete:onError duration:"5615ms" endpoint:"https://sourcegraph.com/.api/completions/stream?api-version=2&client-name=vscode&client-version=1.40.2" outputChannelId:"235b4d67-a9cd-479c-8421-af407bae33be": {"outputChannelId":"235b4d67-a9cd-479c-8421-af407bae33be","duration":5615,"err":"socket hang up"}
█ SourcegraphNodeCompletionsClient request.on('close'): Connection closed without receiving any events (this may be due to an outage with the upstream LLM provider) trace-and-span: {"traceId":"d500d951ba1df5f1127356fa68b9d181","spanId":"9569a2c98b0583bb"}
█ ChatController: postError socket hang up:
█ telemetry-v2 recordEvent: cody.chatResponse/noCode:
█ telemetry-v2 recordEvent: cody.humanMessageEditor/submit:
█ ClientConfigSingleton refreshing configuration:
█ ClientConfigSingleton refreshed: {"codyEnabled":true,"chatEnabled":true,"autoCompleteEnabled":true,"customCommandsEnabled":true,"attributionEnabled":false,"smartContextWindowEnabled":true,"modelsAPIEnabled":false,"latestSupportedCompletionsStreamAPIVersion":5}
█ telemetry-v2 recordEvent: cody.humanMessageEditor/submit:
█ telemetry-v2 recordEvent: cody.chat-question/submitted:
█ DefaultPrompter.makePrompt Ignored 1288 chat messages due to context limit:
█ DefaultPrompter.makePrompt Ignored context due to context limit: user=0, corpus=0, previous=17:
█ telemetry-v2 recordEvent: cody.chat-question/executed:
█ ChatController streamAssistantResponse:
█ Autocomplete:onComplete duration:"7501ms" stopReason:"undefined" outputChannelId:"e6fd03ea-64f1-4776-b436-80a060e25fac":
█ telemetry-v2 recordEvent: cody.chatResponse/noCode:
█ telemetry-v2 recordEvent: cody.editChatButton/clicked:
█ telemetry-v2 recordEvent: cody.humanMessageEditor/submit:
█ telemetry-v2 recordEvent: cody.chat-question/submitted:
█ DefaultPrompter.makePrompt Ignored 1286 chat messages due to context limit:
█ DefaultPrompter.makePrompt Ignored context due to context limit: user=0, corpus=0, previous=17:
█ telemetry-v2 recordEvent: cody.chat-question/executed:
█ ChatController streamAssistantResponse:
█ Autocomplete:onComplete duration:"7325ms" stopReason:"undefined" outputChannelId:"3f7c5606-51f6-4ae7-b60b-f4ef575589e8":
█ telemetry-v2 recordEvent: cody.chatResponse/noCode:
█ telemetry-v2 recordEvent: cody.completion.stageCounter/flush:
█ telemetry-v2 recordEvent: cody.characters/flush:
█ telemetry-v2 recordEvent: cody.humanMessageEditor/submit:
█ telemetry-v2 recordEvent: cody.chat-question/submitted:
█ DefaultPrompter.makePrompt Ignored 1288 chat messages due to context limit:
█ DefaultPrompter.makePrompt Ignored context due to context limit: user=0, corpus=0, previous=17:
█ telemetry-v2 recordEvent: cody.chat-question/executed:
█ ChatController streamAssistantResponse:
█ ClientConfigSingleton refreshing configuration:
█ ClientConfigSingleton refreshed: {"codyEnabled":true,"chatEnabled":true,"autoCompleteEnabled":true,"customCommandsEnabled":true,"attributionEnabled":false,"smartContextWindowEnabled":true,"modelsAPIEnabled":false,"latestSupportedCompletionsStreamAPIVersion":5}
█ UpstreamHealth Ping took 723ms (Gateway: 972ms):
█ Autocomplete:onComplete duration:"12371ms" stopReason:"undefined" outputChannelId:"bbbfc2d1-aacf-4892-999d-1d2b6df7bef4":
█ telemetry-v2 recordEvent: cody.chatResponse/hasCode:
█ telemetry-v2 recordEvent: cody.ghostText/visible:
█ ClientConfigSingleton refreshing configuration:
█ ClientConfigSingleton refreshed: {"codyEnabled":true,"chatEnabled":true,"autoCompleteEnabled":true,"customCommandsEnabled":true,"attributionEnabled":false,"smartContextWindowEnabled":true,"modelsAPIEnabled":false,"latestSupportedCompletionsStreamAPIVersion":5}
█ telemetry-v2 recordEvent: cody.humanMessageEditor/submit:
█ telemetry-v2 recordEvent: cody.chat-question/submitted:
█ DefaultPrompter.makePrompt Ignored 1288 chat messages due to context limit:
█ DefaultPrompter.makePrompt Ignored context due to context limit: user=0, corpus=0, previous=17:
█ telemetry-v2 recordEvent: cody.chat-question/executed:
█ ChatController streamAssistantResponse:
█ Autocomplete:onComplete duration:"7781ms" stopReason:"undefined" outputChannelId:"ac6d3466-a209-4db1-9a7a-7af577e3b6f0":
█ telemetry-v2 recordEvent: cody.chatResponse/noCode:
█ telemetry-v2 recordEvent: cody.humanMessageEditor/submit:
█ telemetry-v2 recordEvent: cody.chat-question/submitted:
█ DefaultPrompter.makePrompt Ignored 1288 chat messages due to context limit:
█ DefaultPrompter.makePrompt Ignored context due to context limit: user=0, corpus=0, previous=17:
█ telemetry-v2 recordEvent: cody.chat-question/executed:
█ ChatController streamAssistantResponse:
█ Autocomplete:onComplete duration:"7518ms" stopReason:"undefined" outputChannelId:"9a4f692a-0720-478e-8ee6-884162ba8b0f":
█ telemetry-v2 recordEvent: cody.chatResponse/noCode:
█ ClientConfigSingleton refreshing configuration:
█ ClientConfigSingleton refreshed: {"codyEnabled":true,"chatEnabled":true,"autoCompleteEnabled":true,"customCommandsEnabled":true,"attributionEnabled":false,"smartContextWindowEnabled":true,"modelsAPIEnabled":false,"latestSupportedCompletionsStreamAPIVersion":5}
█ ClientConfigSingleton refreshing configuration:
█ ClientConfigSingleton refreshed: {"codyEnabled":true,"chatEnabled":true,"autoCompleteEnabled":true,"customCommandsEnabled":true,"attributionEnabled":false,"smartContextWindowEnabled":true,"modelsAPIEnabled":false,"latestSupportedCompletionsStreamAPIVersion":5}
█ telemetry-v2 recordEvent: cody.humanMessageEditor/submit:
█ telemetry-v2 recordEvent: cody.chat-question/submitted:
█ DefaultPrompter.makePrompt Ignored 1288 chat messages due to context limit:
█ DefaultPrompter.makePrompt Ignored context due to context limit: user=0, corpus=0, previous=17:
█ telemetry-v2 recordEvent: cody.chat-question/executed:
█ ChatController streamAssistantResponse:
█ Autocomplete:onComplete duration:"7484ms" stopReason:"undefined" outputChannelId:"29e4abae-96f7-4152-b58c-3d4e9b14984a":
█ telemetry-v2 recordEvent: cody.chatResponse/noCode:
█ ClientConfigSingleton refreshing configuration:
█ ClientConfigSingleton refreshed: {"codyEnabled":true,"chatEnabled":true,"autoCompleteEnabled":true,"customCommandsEnabled":true,"attributionEnabled":false,"smartContextWindowEnabled":true,"modelsAPIEnabled":false,"latestSupportedCompletionsStreamAPIVersion":5}
█ telemetry-v2 recordEvent: cody.humanMessageEditor/submit:
█ telemetry-v2 recordEvent: cody.chat-question/submitted:
█ DefaultPrompter.makePrompt Ignored 1288 chat messages due to context limit:
█ DefaultPrompter.makePrompt Ignored context due to context limit: user=0, corpus=0, previous=17:
█ telemetry-v2 recordEvent: cody.chat-question/executed:
█ ChatController streamAssistantResponse:
█ Autocomplete:onComplete duration:"11458ms" stopReason:"undefined" outputChannelId:"b244fb2d-5972-4c89-8b6c-8589d60b49f8":
█ telemetry-v2 recordEvent: cody.chatResponse/hasCode:
█ telemetry-v2 recordEvent: cody.copyButton/clicked:
█ ClientConfigSingleton refreshing configuration:
█ ClientConfigSingleton refreshed: {"codyEnabled":true,"chatEnabled":true,"autoCompleteEnabled":true,"customCommandsEnabled":true,"attributionEnabled":false,"smartContextWindowEnabled":true,"modelsAPIEnabled":false,"latestSupportedCompletionsStreamAPIVersion":5}
█ telemetry-v2 recordEvent: cody.ghostText/visible:
█ telemetry-v2 recordEvent: cody.keyDown/paste:
█ ClientConfigSingleton refreshing configuration:
█ ClientConfigSingleton refreshed: {"codyEnabled":true,"chatEnabled":true,"autoCompleteEnabled":true,"customCommandsEnabled":true,"attributionEnabled":false,"smartContextWindowEnabled":true,"modelsAPIEnabled":false,"latestSupportedCompletionsStreamAPIVersion":5}
█ telemetry-v2 recordEvent: cody.humanMessageEditor/submit:
█ telemetry-v2 recordEvent: cody.chat-question/submitted:
█ DefaultPrompter.makePrompt Ignored 1288 chat messages due to context limit:
█ DefaultPrompter.makePrompt Ignored context due to context limit: user=0, corpus=0, previous=17:
█ telemetry-v2 recordEvent: cody.chat-question/executed:
█ ChatController streamAssistantResponse:
█ Autocomplete:onComplete duration:"15446ms" stopReason:"undefined" outputChannelId:"3da9a14a-5ecf-495b-a2c9-cd216af065bb":
█ telemetry-v2 recordEvent: cody.chatResponse/hasCode:
█ ClientConfigSingleton refreshing configuration:
█ ClientConfigSingleton refreshed: {"codyEnabled":true,"chatEnabled":true,"autoCompleteEnabled":true,"customCommandsEnabled":true,"attributionEnabled":false,"smartContextWindowEnabled":true,"modelsAPIEnabled":false,"latestSupportedCompletionsStreamAPIVersion":5}

Extension version: 1.40.2
VS Code version: Code 1.95.1 (65edc4939843c90c34d61f4ce11704f09d3e5cb6, 2024-10-31T05:14:54.222Z)
OS version: Windows_NT x64 10.0.19045
Modes:

System Info
Item                Value
CPUs                Intel(R) Core(TM) i5-7200U CPU @ 2.50GHz (4 x 2712)
GPU Status          2d_canvas: enabled
                    canvas_oop_rasterization: enabled_on
                    direct_rendering_display_compositor: disabled_off_ok
                    gpu_compositing: enabled
                    multiple_raster_threads: enabled_on
                    opengl: enabled_on
                    rasterization: enabled
                    raw_draw: disabled_off_ok
                    skia_graphite: disabled_off
                    video_decode: enabled
                    video_encode: enabled
                    vulkan: disabled_off
                    webgl: enabled
                    webgl2: enabled
                    webgpu: enabled
                    webnn: disabled_off
Load (avg)          undefined
Memory (System)     7.78GB (1.94GB free)
Process Argv        --crash-reporter-id 7f387a3a-a8ee-44cc-b8ad-3bb3d6704b84
Screen Reader       no
VM                  0%
A/B Experiments
vsliv368:30146709
vspor879:30202332
vspor708:30202333
vspor363:30204092
vscod805cf:30301675
binariesv615:30325510
vsaa593:30376534
py29gd2263:31024239
c4g48928:30535728
azure-dev_surveyone:30548225
962ge761:30959799
pythongtdpath:30769146
pythonnoceb:30805159
asynctok:30898717
pythonmypyd1:30879173
h48ei257:31000450
pythontbext0:30879054
cppperfnew:31000557
dsvsc020:30976470
pythonait:31006305
dsvsc021:30996838
g316j359:31013175
dvdeprecation:31068756
dwnewjupyter:31046869
newcmakeconfigv2:31071590
impr_priority:31102340
nativerepl2:31139839
refactort:31108082
pythonrstrctxt:31112756
nativeloc2:31134642
cf971741:31144450
iacca1:31171482
notype1cf:31157160
5fd0e150:31155592
dwcopilot:31170013

linear bot commented Nov 7, 2024

BUGS-687 Freezing up

github-actions bot added the bug (Something isn't working) and repo/cody labels on Nov 7, 2024