- **Base URL**: The base URL of your Ollama server (without `/api`)
- **Model**: The default model to use for conversations (e.g., `llama2`, `codellama`, `mistral`)
1. Download and install Ollama from [ollama.ai](https://ollama.ai)
2. Pull a model: `ollama pull llama2`
3. Start the server: `ollama serve`
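Once the server is running, you can verify connectivity with a direct call to Ollama's REST API, which listens on port 11434 by default. The sketch below builds a non-streaming request to the `/api/generate` endpoint; the host, port, and model name are assumptions to adjust for your setup.

```python
import json
import urllib.request

# Default Ollama endpoint; adjust host/port if your server differs (assumption).
OLLAMA_URL = "http://localhost:11434/api/generate"

def build_request(model: str, prompt: str) -> urllib.request.Request:
    """Build a non-streaming generate request for the Ollama REST API."""
    payload = json.dumps({
        "model": model,    # e.g. "llama2" -- must already be pulled
        "prompt": prompt,
        "stream": False,   # return a single JSON response instead of chunks
    }).encode("utf-8")
    return urllib.request.Request(
        OLLAMA_URL,
        data=payload,
        headers={"Content-Type": "application/json"},
        method="POST",
    )

req = build_request("llama2", "Why is the sky blue?")
# To actually send it (requires a running server):
# with urllib.request.urlopen(req) as resp:
#     print(json.loads(resp.read())["response"])
```

If the request succeeds, the JSON response includes the generated text in its `response` field; a connection error usually means `ollama serve` is not running.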
`llama2`, `codellama`, `mistral`, `phi`, `gemma`