martketneutral · 5mo ago

"Generate with AI" fails / vLLM OpenAI compatible server

I am using vLLM to serve a local LLM via its OpenAI-compatible server. I can set the URL, API key, and model successfully in the marimo Settings pane. When I hit "Generate", marimo reaches the server, but the request violates the model's chat template, which requires roles to strictly alternate between "user" and "assistant":
```
openai.BadRequestError: Error code: 400 - {'object': 'error', 'message': 'Conversation roles must alternate user/assistant/user/assistant/...', 'type': 'BadRequestError', 'param': None, 'code': 400}
```
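For context, this 400 comes from the chat template of some models served by vLLM (e.g. Mistral- or Llama-2-style templates), which reject message lists where two consecutive entries share a role. A minimal client-side workaround sketch, assuming the caller controls the message list before it reaches the OpenAI-compatible endpoint (`collapse_roles` is a hypothetical helper, not part of marimo or vLLM):

```python
# Hypothetical client-side workaround, not marimo's internals: merge
# consecutive messages that share a role so the conversation satisfies a
# strict user/assistant alternation requirement before the request is sent
# to the vLLM OpenAI-compatible server.

def collapse_roles(messages):
    """Return a copy of `messages` with adjacent same-role entries merged."""
    merged = []
    for msg in messages:
        if merged and merged[-1]["role"] == msg["role"]:
            # Same role as the previous message: append its content instead
            # of emitting a second consecutive entry with that role.
            merged[-1]["content"] += "\n\n" + msg["content"]
        else:
            merged.append({"role": msg["role"], "content": msg["content"]})
    return merged


if __name__ == "__main__":
    history = [
        {"role": "user", "content": "Write a cell that loads a CSV."},
        {"role": "user", "content": "Use polars, not pandas."},
        {"role": "assistant", "content": "Here is the cell..."},
    ]
    print(collapse_roles(history))  # two entries: one user, one assistant
```

Note that some of these templates also reject a standalone "system" message, so folding any system prompt into the first user message may be needed as well.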
0 Replies