Crimson Nebula
marimo
Created by Crimson Nebula on 1/11/2025 in #help-support
How to use a local AI model like Llama instead of OpenAI
Has anyone tried pointing an AI assistant at a local API endpoint instead of the OpenAI API? I have Ollama running on my laptop with several models that help with my tasks, and I'd like to use those local models instead of relying on Anthropic, OpenAI, or Google. For context, I'm using Llama and Open WebUI.
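Not a definitive answer, but one common approach: Ollama exposes an OpenAI-compatible API at `http://localhost:11434/v1`, so any assistant that lets you override the OpenAI base URL can be pointed at it. Since this is the marimo server, here is a minimal sketch of what that might look like in marimo's config file; the exact section and key names are an assumption, so check the marimo docs for your version:

```toml
# ~/.config/marimo/marimo.toml — a sketch, not verified against current marimo
[ai.open_ai]
model = "llama3.1"                       # any model you have already pulled in Ollama
base_url = "http://localhost:11434/v1"   # Ollama's OpenAI-compatible endpoint
api_key = "ollama"                       # Ollama ignores the key; a placeholder may still be required
```

The general idea carries over to other tools too: anywhere you can set an OpenAI-style `base_url` plus a model name, Ollama can stand in for the hosted API.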
11 replies