Using local Ollama with Opencode

Reading time: < 1 minute

Today we will learn how to use opencode with our local Ollama. By default, the API is exposed at: http://localhost:11434

Check it out:

```
curl http://localhost:11434/api/tags
```

You don't have it? You need to install Ollama.

For Windows (PowerShell):

```
irm https://ollama.com/install.ps1 | iex
```

For Linux:

```
curl -fsSL https://ollama.com/install.sh | sh
```

We download a model, in …
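The `/api/tags` endpoint returns a JSON object whose `models` array lists the models available locally. A minimal sketch of parsing that response in Python; the sample payload below is illustrative, not real output from your machine:

```python
import json

# Illustrative /api/tags payload. A real response from a local Ollama
# daemon has the same top-level shape: {"models": [{"name": ..., ...}]}
sample_response = json.dumps({
    "models": [
        {"name": "llama3.2:latest", "size": 2019393189},
        {"name": "qwen2.5-coder:7b", "size": 4683087332},
    ]
})

def list_model_names(raw: str) -> list[str]:
    """Extract the model names from an Ollama /api/tags JSON response."""
    data = json.loads(raw)
    return [m["name"] for m in data.get("models", [])]

print(list_model_names(sample_response))
```

An empty list here simply means no models have been pulled yet, which is the next step.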







