Plugins with litellm/ollama/mistral? #1304
-
Contact Details: No response

What happened?
Maybe it is expected and only works with OpenAI? I could have missed it from the documentation.
Or I do see:
Steps to Reproduce
What browsers are you seeing the problem on? No response

Relevant log output
LibreChat | [chain/error] [1:chain:AgentExecutor] [3.88s] Chain run errored with error: "AbortError"
LibreChat | Error: AbortError
LibreChat | at EventTarget.<anonymous> (/app/api/node_modules/langchain/dist/chains/base.cjs:109:36)
LibreChat | at [nodejs.internal.kHybridDispatch] (node:internal/event_target:761:20)
LibreChat | at EventTarget.dispatchEvent (node:internal/event_target:703:26)
LibreChat | at abortSignal (node:internal/abort_controller:314:10)
LibreChat | at AbortController.abort (node:internal/abort_controller:332:5)
LibreChat | at abortController.abortCompletion (/app/api/server/middleware/abortMiddleware.js:48:21)
LibreChat | at abortMessage (/app/api/server/middleware/abortMiddleware.js:15:37)
LibreChat | at /app/api/server/middleware/abortMiddleware.js:27:20
LibreChat | at process.processTicksAndRejections (node:internal/process/task_queues:95:5)
LibreChat | [handleResponseMessage] Output: {
LibreChat | output: 'Encountered an error while attempting to respond. Error: AbortError',
LibreChat | errorMessage: 'AbortError',
LibreChat | intermediateSteps: []
LibreChat | }
LibreChat | [llm/error] [2:llm:ChatOpenAI] [3.88s] LLM run errored with error: "Request was aborted."
LibreChat | handleLLMError: {"context":"plugins","conversationId":"efc7605e-3e14-40f5-abd4-ecd7af549f7f","initialMessageCount":6}
LibreChat | Error [AbortError]: Request was aborted.
LibreChat | at wrapOpenAIClientError (/app/api/node_modules/langchain/dist/util/openai.cjs:13:17)
LibreChat | at /app/api/node_modules/langchain/dist/chat_models/openai.cjs:621:69
LibreChat | at process.processTicksAndRejections (node:internal/process/task_queues:95:5)
LibreChat | at async RetryOperation._fn (/app/node_modules/p-retry/index.js:50:12) {
LibreChat | attemptNumber: 1,
LibreChat | retriesLeft: 6
LibreChat | }
LibreChat | [llm/error] [1:llm:ChatOpenAI] [3.87s] LLM run errored with error: "Request was aborted."
LibreChat | Error in handler Handler, handleLLMError: Error: AbortError: Request was aborted.
chat-meilisearch | [2023-12-07T13:33:27Z INFO actix_web::middleware::logger] 172.26.0.5 "GET /indexes/convos/documents/efc7605e-3e14-40f5-abd4-ecd7af549f7f HTTP/1.1" 200 168 "-" "undici" 0.000154
LibreChat | CLIENT RESPONSE
LibreChat | {
LibreChat | messageId: '37dc646f-9607-427e-86a3-195b8df5eb5b',
LibreChat | conversationId: 'efc7605e-3e14-40f5-abd4-ecd7af549f7f',
LibreChat | parentMessageId: 'c9b451f6-e952-413b-b4cc-b7add554bdd6',
LibreChat | isCreatedByUser: false,
LibreChat | isEdited: undefined,
LibreChat | model: 'ollama/mistral',
LibreChat | sender: 'OpenAI',
LibreChat | promptTokens: 133,
LibreChat | error: true,
LibreChat | text: 'Encountered an error while attempting to respond. Error: AbortError',
LibreChat | intermediateSteps: []
LibreChat | }

Screenshots: No response
Replies: 2 comments 5 replies
-
Yes. This is not optimized for non-OpenAI backends and relies heavily on OpenAI functions. In the future this will change, and I'm working on the groundwork to make it all possible, but for now, compatibility cannot be expected.
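To illustrate the dependency on OpenAI functions: the plugins agent expects the model to reply with a structured `function_call` field, which OpenAI models produce but generic completion backends typically do not. A minimal sketch of the payload shapes involved (the tool name and replies are illustrative, not taken from LibreChat's actual code):

```python
# Sketch of the OpenAI-style function-calling exchange the plugins
# agent relies on. The `get_weather` tool and both replies are
# hypothetical examples of the response formats involved.

request = {
    "model": "gpt-3.5-turbo",  # plugins assume an OpenAI-style model
    "messages": [{"role": "user", "content": "What is the weather in Paris?"}],
    "functions": [{  # tool schema advertised to the model
        "name": "get_weather",
        "description": "Look up current weather for a city",
        "parameters": {
            "type": "object",
            "properties": {"city": {"type": "string"}},
            "required": ["city"],
        },
    }],
}

# An OpenAI model replies with a structured function call...
openai_style_reply = {
    "role": "assistant",
    "content": None,
    "function_call": {"name": "get_weather", "arguments": '{"city": "Paris"}'},
}

# ...whereas a plain completion backend returns free text, which the
# agent cannot parse into a tool invocation:
generic_reply = {"role": "assistant", "content": "The weather in Paris is mild."}

def can_drive_agent(reply: dict) -> bool:
    """The agent loop only advances when a function_call is present."""
    return reply.get("function_call") is not None

print(can_drive_agent(openai_style_reply))  # True
print(can_drive_agent(generic_reply))       # False
```

When the backend never emits a `function_call`, the agent run stalls or fails, which is consistent with the AgentExecutor abort seen in the log above.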
-
Does LibreChat support LiteLLM in general?
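For regular chat (outside of plugins), LiteLLM exposes an OpenAI-compatible endpoint, so LibreChat can be pointed at it as if it were OpenAI. A rough sketch under stated assumptions: the port is arbitrary, and the exact reverse-proxy variable name should be checked against the LibreChat docs for your version.

```shell
# Start a LiteLLM proxy fronting the local Ollama model from the log
# above; the proxy speaks the OpenAI chat-completions API.
litellm --model ollama/mistral --port 8000

# Then, in LibreChat's .env, route OpenAI traffic to the proxy
# (variable name is an assumption; verify in the LibreChat docs):
#   OPENAI_REVERSE_PROXY=http://localhost:8000
```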