MCP Overview
Stackie’s MCP server gives AI agents full control over sandboxes, commands, logs, and health data through a structured JSON-RPC interface.
What is MCP?
The Model Context Protocol is an open standard for connecting AI agents to external tools and data sources. Stackie’s MCP server lets agents like Claude, GPT-4, and custom LLM clients control sandboxes, execute commands, stream logs, and query health data — all via a structured JSON-RPC interface.
Transport Modes
Stackie supports two transports:
- SSE (Server-Sent Events) — HTTP-based streaming. Agents connect to http://localhost:<port>/sse and receive a stream of JSON-RPC messages. This is the default transport used by Claude Code and most web-capable agents.
- stdio — Launched as a subprocess via stackie mcp serve --stdio. Used by agents that spawn MCP servers as child processes.
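For the stdio transport, messages are exchanged as newline-delimited JSON-RPC 2.0 frames on the child process's stdin/stdout. The sketch below only builds and decodes a frame so it stays self-contained; in practice you would spawn stackie mcp serve --stdio with subprocess.Popen and write the frame to its stdin. The clientInfo name is a hypothetical placeholder.

```python
import json

def frame(method: str, params: dict, msg_id: int) -> bytes:
    """Encode one JSON-RPC 2.0 request as a newline-delimited frame."""
    msg = {"jsonrpc": "2.0", "id": msg_id, "method": method, "params": params}
    return (json.dumps(msg) + "\n").encode("utf-8")

# Build the first frame an agent would write to the server's stdin.
req = frame("initialize", {"clientInfo": {"name": "example-agent"}}, 1)
decoded = json.loads(req.decode("utf-8"))
print(decoded["method"])
```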
Tool Discovery
When an agent connects, it sends initialize followed by tools/list to
discover available tools. No configuration is needed — agents discover
capabilities automatically at connection time.
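The discovery handshake is just two JSON-RPC requests. The payloads below are a sketch of their typical shape; the protocolVersion string and client name are illustrative assumptions, not values mandated by Stackie.

```python
import json

# First request: initialize, announcing the client and its capabilities.
initialize = {
    "jsonrpc": "2.0", "id": 1, "method": "initialize",
    "params": {
        "protocolVersion": "2024-11-05",          # illustrative version string
        "clientInfo": {"name": "example-agent", "version": "0.1"},
        "capabilities": {},
    },
}

# Second request: tools/list, asking the server to enumerate its tools.
tools_list = {"jsonrpc": "2.0", "id": 2, "method": "tools/list", "params": {}}

# Both serialize to ordinary JSON-RPC messages.
wire = [json.dumps(initialize), json.dumps(tools_list)]
```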
Authentication
The MCP server uses bearer tokens. Each agent gets a unique token at enrollment time (written to the agent’s config file). Tokens are stored as SHA-256 hashes in the database — raw tokens are never persisted.
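The hash-at-rest scheme described above can be sketched as follows. This is a minimal illustration of the idea, not Stackie's actual implementation: issue a random token once, store only its SHA-256 digest, and compare digests in constant time on each request.

```python
import hashlib
import hmac
import secrets

def enroll() -> tuple[str, str]:
    """Issue a raw bearer token; persist only its SHA-256 hex digest."""
    raw = secrets.token_urlsafe(32)
    stored = hashlib.sha256(raw.encode()).hexdigest()
    return raw, stored  # raw goes to the agent's config file, stored to the DB

def verify(presented: str, stored_hash: str) -> bool:
    """Hash the presented token and compare in constant time."""
    digest = hashlib.sha256(presented.encode()).hexdigest()
    return hmac.compare_digest(digest, stored_hash)

raw, stored = enroll()
print(verify(raw, stored))          # a valid token verifies
print(verify("wrong-token", stored))  # a bad token does not
```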
Resource Providers
In addition to tools, the MCP server exposes resource URIs:
- db://{sandbox_id}/{table} — Query sandbox databases
- fs://{sandbox_id}/{path} — Read files inside sandboxes
Resources support both resources/read (single fetch) and
resources/subscribe (live updates via SSE).
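A resource fetch and a subscription differ only in method name. The requests below are a sketch of that shape; the sandbox id, table name, and file path are hypothetical placeholders.

```python
import json

sandbox_id = "sbx-123"  # hypothetical sandbox id

# One-shot fetch of a database resource.
read_req = {
    "jsonrpc": "2.0", "id": 3, "method": "resources/read",
    "params": {"uri": f"db://{sandbox_id}/events"},
}

# Live updates for a file resource, delivered over the SSE stream.
subscribe_req = {
    "jsonrpc": "2.0", "id": 4, "method": "resources/subscribe",
    "params": {"uri": f"fs://{sandbox_id}/var/log/app.log"},
}

wire = [json.dumps(read_req), json.dumps(subscribe_req)]
```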