MCP Server

Maina includes an MCP (Model Context Protocol) server that exposes its engines to any MCP-compatible IDE — Claude Code, Cursor, Windsurf, and others.

The install script handles this automatically:

```sh
curl -fsSL https://api.mainahq.com/install | bash
```

Or configure manually per tool:

.claude/settings.json (auto-created by maina init):

```json
{
  "mcpServers": {
    "maina": {
      "command": "npx",
      "args": ["@mainahq/cli", "--mcp"]
    }
  }
}
```

No plugins, no extensions, no separate server process.
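Other MCP-capable editors use the same `mcpServers` shape. For example, Cursor reads project-level servers from a `.cursor/mcp.json` file; a sketch (verify the file location against Cursor's current MCP docs):

```json
{
  "mcpServers": {
    "maina": {
      "command": "npx",
      "args": ["@mainahq/cli", "--mcp"]
    }
  }
}
```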

The MCP server exposes 10 tools, each delegating to the appropriate engine:

| Tool | Description | Engine |
| --- | --- | --- |
| getContext | Get focused codebase context for a command. Returns context assembled from the relevant layers with the token budget applied. | Context |
| getConventions | Get the project constitution and conventions. Returns .maina/constitution.md content. | Prompt |
| verify | Run the verification pipeline on staged or specified files. Returns findings with severity and location. | Verify |
| checkSlop | Check code for AI-generated slop patterns (filler words, hallucinated imports, dead code). | Verify |
| reviewCode | Run two-stage review on a diff: spec compliance first, then code quality. | Verify + Prompt |
| explainModule | Get a Mermaid dependency diagram for a directory. Visualizes module structure and cross-file references. | Context |
| suggestTests | Generate TDD test stubs from a plan.md file. Returns test code with red-green annotations. | Prompt + Context |
| analyzeFeature | Check spec/plan/tasks consistency for a feature. Reports mismatches and orphaned artifacts. | Context |
| wikiQuery | Search and synthesize answers from the codebase wiki. Returns an AI-generated answer with source citations. | Context (L5) |
| wikiStatus | Wiki health dashboard: article counts, coverage, staleness. | Context |

IDE (Claude Code, Cursor, etc.)
|
| MCP protocol (JSON-RPC over stdio)
|
v
maina --mcp
|
+-- getContext -------> Context Engine (4 layers, dynamic budget)
+-- getConventions ---> Prompt Engine (constitution + custom prompts)
+-- verify -----------> Verify Engine (full pipeline)
+-- checkSlop --------> Verify Engine (slop detector)
+-- reviewCode -------> Verify Engine (two-stage review)
+-- explainModule ----> Context Engine (semantic layer + tree-sitter)
+-- suggestTests -----> Prompt Engine + Context Engine
+-- analyzeFeature ---> Context Engine (cross-artifact analysis)
+-- wikiQuery --------> Context Engine (L5 wiki layer)
+-- wikiStatus -------> Context Engine (wiki health)
|
+-- Cache (all tools respect the 3-layer cache)
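On the wire, each tool invocation is a JSON-RPC 2.0 `tools/call` request over that stdio transport, per the MCP specification. A minimal sketch of the message an IDE would send to invoke `verify` (the `files` argument name is illustrative; the real argument schema is whatever the server advertises in its `tools/list` response):

```python
import json

def mcp_tool_call(request_id: int, tool: str, arguments: dict) -> str:
    """Build an MCP tools/call request as a JSON-RPC 2.0 message.

    MCP's stdio transport exchanges newline-delimited JSON, so the
    serialized message ends with a single newline.
    """
    message = {
        "jsonrpc": "2.0",
        "id": request_id,
        "method": "tools/call",
        "params": {"name": tool, "arguments": arguments},
    }
    return json.dumps(message) + "\n"

# Illustrative call: ask the server to run verify on one file.
request = mcp_tool_call(1, "verify", {"files": ["src/app.ts"]})
```

The IDE writes this line to the server's stdin and reads the tool result, also newline-delimited JSON, from its stdout.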

Every tool call goes through the same engines as the CLI commands. The cache ensures repeated queries return instantly.
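As an illustration only (a single-layer toy, not Maina's actual 3-layer cache), a result cache keyed by tool name and arguments behaves like this:

```python
import hashlib
import json

class ToolCache:
    """Toy result cache keyed by a stable hash of (tool, arguments)."""

    def __init__(self):
        self._store = {}

    def key(self, tool: str, arguments: dict) -> str:
        # sort_keys makes the hash independent of argument order
        payload = json.dumps([tool, arguments], sort_keys=True)
        return hashlib.sha256(payload.encode()).hexdigest()

    def get_or_compute(self, tool, arguments, compute):
        k = self.key(tool, arguments)
        if k not in self._store:
            self._store[k] = compute()  # first call: run the engine
        return self._store[k]           # repeat calls: cache hit
```

A repeated query with the same tool and arguments hits the memo table instead of re-running the engine.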

With the MCP server running, your IDE’s AI assistant gains:

  • Codebase awareness: getContext provides the same 4-layer context the CLI uses, so the AI sees relevant code, not everything.
  • Convention enforcement: getConventions injects your constitution into every interaction.
  • Inline verification: verify and checkSlop catch issues as you write, not after you commit.
  • Review on demand: reviewCode gives you the same two-stage review the CLI provides, inside your editor.
  • Test suggestions: suggestTests generates TDD stubs from your plan files.
  • Wiki search: wikiQuery searches compiled codebase knowledge and returns AI-synthesized answers with citations.

The MCP server is the same three engines, accessible from any tool that speaks MCP.