*VS Code Copilot now supports local AI models via llama.cpp or Ollama. Unlimited free tokens, offline operation, full privacy, plus image support for on-device AI vision and agent mode with function calling: it’s all included.
I get 10x better results without the token limits. All tokens are “on the house”; literally on my house, lol. No sneaky telemetry, no subscription math, no “User Premium Prompt …*
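For the Ollama route, a minimal setup sketch might look like the following. The model name is just an example, and the exact VS Code UI steps may vary between Copilot versions; this assumes Ollama is already installed.

```shell
# Pull an example model locally (any model from the Ollama library works)
ollama pull llama3.2

# Start the local server; by default it listens on http://localhost:11434
ollama serve &

# Confirm the model downloaded successfully
ollama list

# In VS Code: open Copilot Chat, click the model picker,
# choose the option to manage/add models, select the Ollama provider,
# and pick the pulled model from the list.
```

Once selected, chat and agent requests are served entirely from the local machine, which is what makes the offline and privacy claims above possible.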