AI & MCP
Category: AI & MCP
What it does: Controls Kunobi's built-in MCP (Model Context Protocol) server and AI conversation features. The MCP server lets AI tools like Claude Code or OpenAI Codex query your live Kubernetes cluster context directly.
MCP Server
What is MCP?
The Model Context Protocol (MCP) is an open standard that lets AI tools connect to external data sources. Kunobi implements an MCP server that exposes your connected cluster's context — namespaces, resources, status — so AI coding assistants can answer Kubernetes questions with real, live data.
Example use cases:
- Ask Claude Code "what pods are failing in production right now?" — it queries your cluster directly
- Ask Codex to generate a manifest matching your existing resource naming conventions
- Use AI to help debug a Flux reconciliation error with full context of what's running
Enabling the MCP Server
- Navigate to Settings → AI & MCP
- Toggle MCP Server to On
- The server starts immediately and shows its bind address (e.g., `localhost:3456`)
MCP Port Configuration
Use the Server Port field to set which port the MCP server listens on. Each variant has its own default (e.g., 3500 for Local, 3400 for Dev). Set to 0 to use the variant default.
Change this only if the default port is already in use by another application on your machine.
MCP Tools Reference
The MCP server exposes these tools to connected AI clients:
| Tool | Description |
|---|---|
| describe | Get detailed information about a K8s resource |
| events | List Kubernetes events for a resource |
| pod_logs | Read container logs from a pod |
| monitoring | Query monitoring data |
| state | Query current application state |
| app_info | Get Kunobi app information (version, variant, etc.) |
These tools give AI assistants real-time access to your cluster context, enabling them to answer questions about running workloads, debug issues, and generate manifests based on actual cluster state.
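As a concrete illustration, MCP clients invoke these tools over JSON-RPC 2.0 using the `tools/call` method. The sketch below builds such a request for the `describe` tool; the argument names (`kind`, `name`, `namespace`) are illustrative assumptions, not Kunobi's documented schema:

```shell
# Hypothetical "tools/call" request for the describe tool. MCP messages
# are JSON-RPC 2.0; the argument names below are illustrative guesses,
# not Kunobi's documented schema.
payload='{"jsonrpc":"2.0","id":1,"method":"tools/call","params":{"name":"describe","arguments":{"kind":"Pod","name":"my-pod","namespace":"default"}}}'
echo "$payload"

# Sending it over an HTTP transport (endpoint path assumed) would look like:
#   curl -s -X POST http://localhost:3200/mcp \
#        -H 'Content-Type: application/json' -d "$payload"
```

In practice your MCP client constructs these messages for you; this only shows what crosses the wire.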
CLI Integration
Once the MCP server is running, copy the ready-made commands from the CLI Integration section and run them in your terminal.
One-Command Installer
The fastest way to configure MCP across all supported clients:
```
npx @kunobi/mcp install
```
This automatically sets up MCP for Claude Code, Cursor, Windsurf, Gemini CLI, and Codex in a single step.
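If you prefer to register a single client by hand, a sketch for Claude Code might look like the following. The `/mcp` endpoint path and HTTP transport are assumptions, so prefer the exact ready-made command from the CLI Integration section; 3200 is Stable's default port, substitute your variant's:

```shell
# Manual registration for Claude Code (sketch). "claude mcp add" is a
# real subcommand; the /mcp endpoint path here is an assumption, so
# prefer the ready-made command from the CLI Integration section.
port=3200   # Stable's default; substitute your variant's port
claude_cmd="claude mcp add --transport http kunobi http://localhost:${port}/mcp"
echo "$claude_cmd"
```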
Removing MCP Integration
To unregister Kunobi from a specific client:
```
# Remove from Claude Code
claude mcp remove kunobi

# Remove from Codex
codex mcp remove kunobi
```
AI Conversation Settings
Auto-Create AI Conversation
Category: General
What it does: Automatically creates a new AI conversation when you open the AI tab in the terminal panel.
- Enabled (On): A new conversation is created automatically when you switch to the AI tab
- Disabled (Off): The AI tab opens empty; you create conversations manually
How to change: Settings → General → Auto-Create AI Conversation toggle
AI Conversations
Kunobi includes a built-in AI assistant accessible from the AI tab in the terminal panel. The assistant uses a specialized Kubernetes agent that can reference your live cluster context.
For full details on using AI conversations, see AI Conversations.
Troubleshooting
MCP Server Won't Start
- Check that the selected port is not in use by another application
- Try switching back to auto-select port
- Restart Kunobi if the server stays in a failed state
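To test whether the port is already taken, you can probe it from a bash shell (a sketch; `port_in_use` is a hypothetical helper, and tools like `lsof -i :3200` work as well):

```shell
# Probe 127.0.0.1:$1 via bash's /dev/tcp; returns 0 if something is
# already listening there.
port_in_use() {
  (exec 3<>"/dev/tcp/127.0.0.1/$1") 2>/dev/null
}

port=3200   # Stable's default; substitute the port shown in settings
if port_in_use "$port"; then
  echo "port $port is already in use, pick another"
else
  echo "port $port is free"
fi
```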
Claude Code Can't Connect
- Verify the MCP server is running (green status indicator in settings)
- Confirm the port in the Claude Code command matches the running port shown in settings
- Run `claude mcp list` to check if the `kunobi` MCP server is registered correctly
Port Changes Between Restarts
This is expected behavior when using auto-select port. Update your Claude Code or Codex integration with the new port shown in settings after restarting Kunobi.
To avoid this, configure a fixed port in the settings.
MCP Ports by Variant
Each Kunobi variant listens on a dedicated MCP port so multiple variants can run simultaneously without conflicts. Ports follow the formula 3000 + (variant index × 100), with Stable = 2, Unstable = 3, Dev = 4, and Local = 5:
| Variant | Window Title | MCP Port |
|---|---|---|
| Stable | Kunobi | 3200 |
| Unstable | Kunobi (Unstable) | 3300 |
| Dev | Kunobi (Dev) | 3400 |
| Local | Kunobi (Local) | 3500 |
If you set the port to `0` in settings, the variant default from the table above is used. Unknown variants fall back to the Local port (3500).
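This fallback rule can be sketched as a small lookup (the `resolve_port` helper name and lowercase variant keys are illustrative, not Kunobi identifiers):

```shell
# Variant-to-port mapping from the table above; resolve_port and the
# lowercase keys are illustrative, not Kunobi identifiers.
resolve_port() {
  case "$1" in
    stable)   echo 3200 ;;
    unstable) echo 3300 ;;
    dev)      echo 3400 ;;
    local)    echo 3500 ;;
    *)        echo 3500 ;;  # unknown variants fall back to Local
  esac
}

resolve_port dev    # prints 3400
```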
If you run multiple variants at the same time (e.g., Stable and Dev), each is reachable on its own port. Make sure your CLI integration points to the correct port for the variant you want to interact with.