OpenRouter provides access to AI models from OpenAI, Anthropic, and many other providers through a single API. GitHub Copilot's Agent Mode, however, requires models that support function calling (tools), and OpenRouter's API doesn't advertise tool support for its models.
This prevents powerful models like Claude, GPT-4, and others from being used through OpenRouter with Copilot's advanced Agent Mode features.
copilot-ollama creates a local proxy chain that:
- 🔄 Forwards requests to OpenRouter while preserving tool support
- 🛠️ Makes OpenRouter models compatible with Copilot's Ollama integration
- 🚀 Enables Agent Mode with any OpenRouter model
- 🔧 Uses LiteLLM for OpenAI-compatible proxying
- 🔗 Uses oai2ollama for Ollama compatibility
- The uv package manager (one-line installer below)
- An OpenRouter API key (get one at https://openrouter.ai/keys)
- VSCode with the GitHub Copilot extension
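If uv isn't installed yet, its standalone installer (documented by uv upstream) is a one-liner:

```bash
# Install uv via Astral's official installer script
curl -LsSf https://astral.sh/uv/install.sh | sh
```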
- **Clone and navigate to the project**

  ```bash
  git clone https://github.com/bascodes/copilot-ollama.git
  cd copilot-ollama
  ```
- **Set your OpenRouter API key**

  ```bash
  export OPENROUTER_API_KEY=your_openrouter_api_key_here
  ```
- **Start the proxy servers** (see the sketch after this list)
- **Configure VSCode**
  - Open VSCode settings
  - Set `github.copilot.chat.byok.ollamaEndpoint` to `http://localhost:11434`
  - Click "Manage Models" → Select "Ollama"
- **Start coding!** 🎉 Your OpenRouter models are now available in Copilot Agent Mode.
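The repository's own launch command isn't captured above. As a sketch, the two proxies can also be started by hand; this assumes LiteLLM's proxy CLI (installed with the `litellm[proxy]` extra) and oai2ollama's `--api-key`/`--base-url` flags, and the port numbers are illustrative, so check each tool's `--help` before relying on it:

```bash
# Terminal 1: LiteLLM's OpenAI-compatible proxy, loading config.yaml
litellm --config config.yaml --port 4000

# Terminal 2: oai2ollama, exposing an Ollama-style API on port 11434
# and forwarding OpenAI-style calls to the LiteLLM proxy above
oai2ollama --api-key "$OPENROUTER_API_KEY" --base-url http://localhost:4000
```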
Edit `config.yaml` to add or modify the available models:
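The file follows LiteLLM's proxy config format. A minimal sketch, assuming each entry maps a local model name to an OpenRouter route and reads the key from the `OPENROUTER_API_KEY` environment variable (the repo's actual file may differ):

```yaml
# Excerpt: each entry maps a local model name to an OpenRouter model.
model_list:
  - model_name: claude-3-sonnet                    # name Copilot will see
    litellm_params:
      model: openrouter/anthropic/claude-3-sonnet  # OpenRouter route
      api_key: os.environ/OPENROUTER_API_KEY       # read from the environment
```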
Here are some recommended models to add:
| Model Name | LiteLLM Model String | Notes |
|------------|----------------------|-------|
| claude-3-sonnet | openrouter/anthropic/claude-3-sonnet | Excellent for code generation |
| gpt-4-turbo | openrouter/openai/gpt-4-turbo | Latest GPT-4 with improved performance |
| mixtral-8x7b | openrouter/mistralai/mixtral-8x7b-instruct | Fast and capable open-source model |
| llama-3-70b | openrouter/meta-llama/llama-3-70b-instruct | Meta's powerful open model |
- VSCode Copilot sends requests to what it thinks is an Ollama server
- oai2ollama translates Ollama API calls to OpenAI format
- LiteLLM proxies OpenAI-compatible requests to OpenRouter
- OpenRouter routes to the actual AI model providers
- Tool/function calling capabilities are preserved throughout the chain
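A quick way to confirm the chain is up is to issue the same "list models" call Copilot's Ollama integration makes. `/api/tags` is Ollama's standard model-listing route; that oai2ollama serves it identically is an assumption worth verifying:

```bash
# Should return the models defined in config.yaml, in Ollama's JSON format
curl http://localhost:11434/api/tags
```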
We welcome contributions! Here's how you can help:
- 🐛 Report bugs by opening an issue
- 💡 Suggest features or improvements
- 📖 Improve documentation
- 🔧 Submit pull requests
This project is licensed under the MIT License - see the LICENSE file for details.
- LiteLLM for the excellent proxy framework
- oai2ollama for Ollama compatibility
- OpenRouter for model access
- The VSCode and GitHub Copilot teams
⭐ Star this repo if it helped you unlock Copilot Agent Mode with your favorite models!