# api-to-mcp

```sh
npx skills add https://github.com/mcp-com-ai/api-to-mcp-skills --skill api-to-mcp
```
## API to MCP Server Skill

Convert OpenAPI v3 specifications into fully functional MCP (Model Context Protocol) servers using the HAPI CLI – no code required.
## Workflow

### Phase 1: Environment Setup

1. Verify the HAPI CLI installation:

   ```sh
   hapi --version
   ```

   If it is not installed, install it using:

   - Linux/macOS: `curl -fsSL https://get.mcp.com.ai/hapi.sh | bash`
   - Windows: `irm https://get.mcp.com.ai/hapi.ps1 | iex`
   - Docker: use the `hapimcp/hapi-cli:latest` image directly

2. Validate that the OpenAPI spec is v3.x (not v2/Swagger). HAPI requires OpenAPI 3.0+ specifications.
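The verification step above can be wrapped in a small preflight snippet. This is a sketch that only reports status and installs nothing:

```sh
# Preflight: confirm the HAPI CLI is on PATH before moving to Phase 2.
if command -v hapi >/dev/null 2>&1; then
  hapi --version
else
  echo "hapi not found; install with: curl -fsSL https://get.mcp.com.ai/hapi.sh | bash"
fi
```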
### Phase 2: Local Development & Testing

1. Start a local MCP server:

   ```sh
   hapi serve <project-name> \
     --openapi <path-or-url-to-openapi-spec> \
     --url <backend-api-url> \
     --headless \
     --port 3030
   ```

2. Verify server health:

   ```sh
   curl -s http://localhost:3030/health    # Expected: 200 OK
   curl -s http://localhost:3030/mcp/ping  # Expected: pong
   ```

3. Test the MCP endpoint – the MCP transport is available at `/mcp` using Streamable HTTP:

   - `POST /mcp` – JSON-RPC messages
   - `GET /mcp` – server info (returns 204)
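Once the health checks pass, a first JSON-RPC message can be sent by hand. The sketch below uses the standard MCP `tools/list` method; a full client would normally send `initialize` first, and some servers require it. The port and endpoint are taken from the setup above:

```sh
# Smoke test: ask the running server which tools it generated from the spec.
PAYLOAD='{"jsonrpc":"2.0","id":1,"method":"tools/list","params":{}}'
curl -s -X POST http://localhost:3030/mcp \
  -H 'Content-Type: application/json' \
  -H 'Accept: application/json, text/event-stream' \
  -d "$PAYLOAD" || true
```

A healthy server should answer with a JSON-RPC result whose `tools` array reflects the operations in the OpenAPI spec.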
### Phase 3: Deployment

Choose a deployment target based on your requirements.

#### Option A: Cloudflare Workers (recommended for simplicity)

1. Authenticate (first time only):

   ```sh
   hapi login
   ```

2. Deploy:

   ```sh
   hapi deploy \
     --openapi <path-or-url-to-openapi-spec> \
     --url <backend-api-url> \
     --name <worker-name> \
     --project <project-name>
   ```

3. Verify the deployment:

   ```sh
   curl -s https://<worker-name>.<account>.workers.dev/health
   ```
#### Option B: Docker (self-hosted)

```sh
docker run --name hapi-<project> -d --rm \
  -p 3030:3030 \
  hapimcp/hapi-cli:latest serve <project-name> \
  --openapi <openapi-url> \
  --url <backend-api-url> \
  --port 3030 \
  --headless
```
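For repeat runs it can help to template that invocation. The sketch below only prints the assembled `docker run` command rather than executing it; the project name and URLs are illustrative, borrowed from the Petstore demo:

```sh
# Dry run: assemble the docker command for a given project without executing it.
project="petstore"
spec="https://petstore3.swagger.io/api/v3/openapi.json"
backend="https://petstore3.swagger.io/api/v3"

cmd="docker run --name hapi-$project -d --rm -p 3030:3030 \
hapimcp/hapi-cli:latest serve $project \
--openapi $spec --url $backend --port 3030 --headless"

echo "$cmd"
```

Because the container starts with `--rm`, `docker stop hapi-petstore` both stops and removes it.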
### Phase 4: Verification Checklist

Before considering the deployment complete:

- `/health` returns 200 OK
- `/mcp/ping` returns `pong`
- At least one tool call succeeds via the MCP transport
- Error responses include actionable messages
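The first two checklist items lend themselves to a small script. This sketch assumes the local server from Phase 2 on port 3030 and prints PASS or FAIL instead of exiting nonzero:

```sh
# Automated pass over the health items of the checklist.
BASE="http://localhost:3030"
health=$(curl -s -o /dev/null -w '%{http_code}' "$BASE/health" 2>/dev/null || true)
ping=$(curl -s "$BASE/mcp/ping" 2>/dev/null || true)
echo "health: $health"
echo "ping:   $ping"
if [ "$health" = "200" ] && [ "$ping" = "pong" ]; then
  echo "PASS"
else
  echo "FAIL: server not reachable or not healthy"
fi
```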
## Key Concepts

### Headless Mode

Use the `--headless` flag when the MCP server proxies requests to an existing backend API. Without this flag, HAPI serves the OpenAPI spec documentation UI instead.

### Streamable HTTP Transport

HAPI uses Streamable HTTP (not stdio). MCP clients connect via HTTP POST to the `/mcp` endpoint. SSE is supported for streaming responses.
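Concretely, both HTTP verbs can be exercised with plain `curl`; no stdio pipe is involved. This sketch assumes the local server from Phase 2; `ping` here is the standard MCP JSON-RPC liveness method, distinct from the `/mcp/ping` HTTP path:

```sh
BASE="http://localhost:3030"

# GET returns server info (204, per Phase 2's notes).
curl -s -o /dev/null -w 'GET /mcp -> %{http_code}\n' "$BASE/mcp" || true

# POST carries JSON-RPC messages.
PING='{"jsonrpc":"2.0","id":1,"method":"ping"}'
curl -s -X POST "$BASE/mcp" \
  -H 'Content-Type: application/json' \
  -H 'Accept: application/json, text/event-stream' \
  -d "$PING" || true
```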
### OpenAPI Spec Sources

HAPI accepts specs from:

- a local file path: `--openapi ./specs/api.yaml`
- a remote URL: `--openapi https://api.example.com/openapi.json`
- pre-loaded specs in `~/.hapi/specs/<project>/`
## References

For detailed documentation, consult:

- `references/hapi-cli-commands.md` – complete CLI reference
- `references/deployment-patterns.md` – Docker & Cloudflare recipes
- `references/troubleshooting.md` – common issues & fixes
## Quick Examples

### Petstore API (Demo)

```sh
hapi serve petstore \
  --openapi https://petstore3.swagger.io/api/v3/openapi.json \
  --url https://petstore3.swagger.io/api/v3 \
  --headless --port 3030
```
### Custom API

```sh
hapi serve my-api \
  --openapi https://my-api.example.com/openapi.json \
  --url https://my-api.example.com \
  --headless --port 3030
```
### Deploy to Cloudflare

```sh
hapi deploy \
  --openapi https://my-api.example.com/openapi.json \
  --url https://my-api.example.com \
  --name my-api-mcp
```