martketneutral
Created by martketneutral on 7/11/2024 in #help-support
"Generate with AI" fails / vLLM OpenAI compatible server
I am using vLLM to serve a local LLM through the vLLM OpenAI-compatible server. I can set the URL, API key, and model successfully in the marimo Settings pane. When I hit "Generate", marimo reaches the server, but the request violates the server's requirement that chat roles alternate between "user" and "assistant".
openai.BadRequestError: Error code: 400 - {'object': 'error', 'message': 'Conversation roles must alternate user/assistant/user/assistant/...', 'type': 'BadRequestError', 'param': None, 'code': 400}
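For reference, here is a minimal sketch of the constraint and one possible client-side workaround, assuming a local vLLM server at http://localhost:8000/v1 and the `openai` Python client; the base URL, API key, and model name are placeholders, and the merge helper is just an illustration of collapsing consecutive same-role messages so the conversation alternates user/assistant, which is what vLLM's chat template enforces.
```python
from openai import OpenAI

# Assumed local vLLM OpenAI-compatible endpoint; adjust to your setup.
client = OpenAI(base_url="http://localhost:8000/v1", api_key="EMPTY")

def merge_consecutive_roles(messages):
    """Collapse consecutive messages that share a role into one message,
    so the resulting conversation alternates user/assistant as vLLM expects."""
    merged = []
    for msg in messages:
        if merged and merged[-1]["role"] == msg["role"]:
            merged[-1]["content"] += "\n\n" + msg["content"]
        else:
            merged.append(dict(msg))
    return merged

# Example of the shape that triggers the 400: two "user" messages in a row
# (e.g. an instruction prompt sent as "user" followed by the actual request).
messages = [
    {"role": "user", "content": "You are a helpful coding assistant."},
    {"role": "user", "content": "Write a marimo cell that plots a sine wave."},
]

response = client.chat.completions.create(
    model="my-local-model",  # hypothetical model name served by vLLM
    messages=merge_consecutive_roles(messages),
)
print(response.choices[0].message.content)
```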
1 reply