
groq API: I have openai.BadRequestError: Error code: 400 #375

Open
aborruso opened this issue Jan 4, 2025 · 1 comment
Labels: bug (Something isn't working), provider (Relating to LLM providers)

Comments

aborruso commented Jan 4, 2025

Hi @ErikBjare,
after I run the program, I get:

[09:34:23] Browser tool available (using playwright)
[09:34:26] Using model: groq/llama-3.1-8b-instant
[09:34:27] INFO     Using fallback metadata for unknown model llama-3.1-8b-instant from groq                                 models.py:185
[09:34:27] Using logdir ~/.local/share/gptme/logs/2025-01-04-sprinting-angry-monster
           Using workspace at ~/git/progetti/tasks

Then, if I write compute fib 10, I get:

openai.BadRequestError: Error code: 400 - {'error': {'message': "'messages.0' : for 'role:system' the following must be satisfied[('messages.0.content' : value must be a string)]", 'type': 'invalid_request_error'}}

What am I doing wrong?

Thank you

(screenshot of the terminal output attached)

ErikBjare (Owner) commented Jan 4, 2025

Looks like we accidentally broke Groq support during a refactor; I will investigate.
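The 400 error suggests the system message's content is being sent as a list of content parts (OpenAI's multi-part message format), while Groq's OpenAI-compatible endpoint requires content to be a plain string for role:system. A minimal sketch of a workaround, flattening multi-part content back into a string before the request is sent (the helper name flatten_message_content is hypothetical, not part of gptme):

```python
def flatten_message_content(messages):
    """Return a copy of `messages` where any list-valued `content`
    (e.g. [{"type": "text", "text": "..."}]) is joined into one string,
    which is the shape Groq's endpoint expects for system messages."""
    flattened = []
    for msg in messages:
        content = msg["content"]
        if isinstance(content, list):
            # Keep only the text parts; other part types have no plain-string form here.
            content = "\n".join(
                part["text"] for part in content if part.get("type") == "text"
            )
        flattened.append({**msg, "content": content})
    return flattened

# The payload shape that triggers the 400, and its flattened equivalent:
bad = [{"role": "system",
        "content": [{"type": "text", "text": "You are a helpful assistant."}]}]
good = flatten_message_content(bad)
print(good[0]["content"])  # -> You are a helpful assistant.
```

Applying a transform like this to the messages list just before the chat-completions call (only for providers that reject list-form content) would sidestep the invalid_request_error while leaving multi-part content intact for providers that support it.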

ErikBjare added the bug and provider labels on Jan 4, 2025
2 participants