[Question]: Model specs usage with gptPlugins over Azure AI model issue #3064
Unanswered
childotg asked this question in Troubleshooting
Replies: 2 comments
-
When you use Plugins without modelSpecs, does your list show "gpt-35-turbo-0125"? Note, it's recommended you use the actual model identifier, as defined by OpenAI, when configuring this:

```yaml
models:
  gpt-3.5-turbo-0125: # using "." in 3.5
    deploymentName: "gpt-35-turbo-0125" # the actual deployment name (no ".")
  gpt-4o-2024-05-13:
    deploymentName: "gpt-4o-2024-05-13"
```

Maybe the mismatch is coming from there. The same config works for me (Azure using plugins, with your modelSpec).
-
@danny-avila thank you. Can you please share your config? From my side it's still not working.
-
What is your question?
It may be connected to #2972, but for me it does not work.
This is the configuration and the error I get. The model `gpt-35-turbo-0125` is defined as follows:
```yaml
endpoints:
  azureOpenAI:
    plugins: true
    assistants: false
    summarize: true
    summaryModel: "current_model"
    titleConvo: true
    titleModel: "current_model"
    titleMethod: "functions"
    groups:
      - group: "..." # arbitrary name
        apiKey: "..."
        instanceName: "..." # name of the resource group or instance
        version: "2024-03-01-preview"
        assistants: false
        models:
          gpt-35-turbo-0125:
            deploymentName: gpt-35-turbo-0125
          gpt-4o-2024-05-13:
            deploymentName: gpt-4o-2024-05-13
```
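The question title refers to model specs used with gptPlugins; the corresponding modelSpecs block is not shown in the thread, but with the keys defined as above it would presumably look something like this sketch (name and label are hypothetical):

```yaml
modelSpecs:
  list:
    - name: "azure-gpt35-plugins"      # hypothetical name
      label: "GPT-3.5 Turbo (Plugins)" # hypothetical label
      preset:
        endpoint: "gptPlugins"
        model: "gpt-35-turbo-0125"     # must match the key under models: above, character for character
```

A mismatch between that model string and the keys under models: is the kind of problem the earlier reply points to.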
More Details
Main subject of the question: Endpoints
Screenshots: No response