A CLI tool for interacting with Model Context Protocol (MCP) servers using natural language.
Built with mcp-use. You can build your own MCP application with our SDKs. Features:
- 🤖 Natural language interface for MCP servers
- 💬 Interactive chat interface with tool call visualization
- ⚡ Direct integration with mcp-use (no API layer needed)
- 🚀 Single command installation
- 🔄 Over a dozen LLM providers (OpenAI, Anthropic, Google, Mistral, Groq, Cohere, and more)
- ⚙️ Slash commands for configuration (like Claude Code)
- 🔑 Smart API key prompting - automatically asks for keys when needed
- 💾 Persistent secure storage - encrypted keys and settings saved across sessions
Install and run:
```bash
npm install --global @mcp-use/cli
mcp-use
```
Choose your model (CLI handles API key setup automatically):
```
# Just pick a model - that's it!
/model openai gpt-4o
/model anthropic claude-3-5-sonnet-20240620
/model google gemini-1.5-pro
/model groq llama-3.1-70b-versatile
/model ollama llama3

# CLI will prompt: "Please enter your OPENAI API key:"
# Paste your key and start chatting immediately!
```
When prompted, get API keys from providers like OpenAI, Anthropic, Google, Mistral, Groq, or Cohere.
Keys are stored securely encrypted in ~/.mcp-use-cli/config.json and persist across sessions.
If you prefer environment variables:
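For example, you can export keys in your shell before launching the CLI (the variable names below follow each provider's common convention; check your provider's docs for the exact name):

```shell
# Assumed standard variable names; set only the ones for providers you use
export OPENAI_API_KEY="sk-..."
export ANTHROPIC_API_KEY="sk-ant-..."
export GOOGLE_API_KEY="..."
```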
This CLI is a client for Model Context Protocol (MCP) servers. MCP servers act as tools that the AI can use. You need to connect the CLI to one or more servers to give it capabilities.
You can manage servers with the /server commands:
When you add a server, you'll be prompted for its JSON configuration. Here are examples for local and remote servers:
Local Server Example (e.g., a filesystem tool):
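A local server configuration might look like the sketch below, assuming the common `mcpServers` JSON shape used by MCP clients; the `filesystem` server name, the `npx` package, and the path are illustrative:

```json
{
  "mcpServers": {
    "filesystem": {
      "command": "npx",
      "args": ["-y", "@modelcontextprotocol/server-filesystem", "/path/to/allowed/dir"]
    }
  }
}
```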
Remote Server Example (e.g., an SSE endpoint):
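A remote SSE server configuration might look like this sketch (the `remote-tools` name and URL are placeholders):

```json
{
  "mcpServers": {
    "remote-tools": {
      "url": "https://example.com/sse"
    }
  }
}
```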
This configuration would be pasted directly into the CLI after running /server add.
Switch LLM providers and configure settings using slash commands:
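For example, using the commands shown elsewhere in this README:

```
/model anthropic claude-3-5-sonnet-20240620
/server add
```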
- "List files in the current directory"
- "Create a new file called hello.txt with the content 'Hello, World!'"
- "Search for files containing 'TODO'"
- "What's the structure of this project?"
This CLI uses:
- Frontend: React + Ink for the terminal UI
- Agent: mcp-use MCPAgent for LLM + MCP integration
- LLM: Your choice of 12+ providers
- Transport: Direct TypeScript integration (no API layer)
This package uses Scarf to collect basic installation analytics to help us understand how the package is being used. This data helps us improve the tool and prioritize features.
Scarf collects:
- Operating system information
- IP address (used only for company lookup, not stored)
- Limited dependency tree information (hashed for privacy)
No personally identifying information is stored.
You can opt out of analytics in several ways:
Option 1: Environment variable
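Assuming this package follows Scarf's standard opt-out convention, set `SCARF_ANALYTICS` before installing:

```shell
# Standard Scarf opt-out; set before running npm install
export SCARF_ANALYTICS=false
```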
Option 2: Standard Do Not Track
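Scarf also honors the cross-tool Do Not Track convention:

```shell
# Generic Do Not Track signal respected by Scarf and many other tools
export DO_NOT_TRACK=1
```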
Option 3: For package maintainers. If you distribute a package that depends on this CLI, you can disable analytics for all your downstream users by adding this to your package.json:
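Per Scarf's documented `scarfSettings` field, the package.json addition would look like:

```json
{
  "scarfSettings": {
    "enabled": false
  }
}
```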
For more information about Scarf and privacy, visit scarf.sh.
MIT