Tag: ollama
All articles tagged "ollama".
-
Run Local AI Models in VS Code Using Ollama and Continue
3 min read
Full guide to connecting Ollama models to VS Code with Continue. Optimize your workflow by using a fast 1.5B model for autocomplete and a powerful 14B model for chat, all running 100% locally on your machine.
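The linked article walks through the full setup; as a rough illustration only, below is a minimal sketch of what a Continue config pointing at two local Ollama models could look like. The model names (qwen2.5-coder:1.5b and qwen2.5-coder:14b) are assumptions, not taken from the article, and the key layout follows Continue's older config.json format, so defer to the post and the Continue documentation for the exact schema.

```json
{
  "models": [
    {
      "title": "Qwen 2.5 Coder 14B (chat)",
      "provider": "ollama",
      "model": "qwen2.5-coder:14b"
    }
  ],
  "tabAutocompleteModel": {
    "title": "Qwen 2.5 Coder 1.5B (autocomplete)",
    "provider": "ollama",
    "model": "qwen2.5-coder:1.5b"
  }
}
```

The split mirrors the article's premise: a small model keeps inline autocomplete responsive, while the larger model handles chat where answer quality matters more than latency.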