Enterprise MCP Server for Agentic LLMs
An open-source Model Context Protocol (MCP) server that exposes GitHub and internal CRUD APIs as tools for LLM agents. Authentication via GitHub OAuth2; token-bucket rate limiting backed by Redis (500 req/min). One-command deployment with Docker Compose. Measured performance: 558 req/s throughput, 3.7 ms health-check latency, ~5 ms p50 API latency.
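A one-command Docker Compose deployment for a server plus Redis would look roughly like this. The service names, image tags, and ports here are assumptions for illustration, not this repository's actual `docker-compose.yml`.

```yaml
# Hypothetical docker-compose.yml sketch: app service plus the Redis
# instance used for rate-limit state. Names and ports are illustrative.
services:
  mcp-server:
    build: .
    ports:
      - "8080:8080"
    environment:
      REDIS_URL: redis://redis:6379/0
      # OAuth2 credentials supplied via the environment, not committed.
      GITHUB_CLIENT_ID: ${GITHUB_CLIENT_ID}
      GITHUB_CLIENT_SECRET: ${GITHUB_CLIENT_SECRET}
    depends_on:
      - redis
  redis:
    image: redis:7-alpine
```

With a file like this in place, `docker compose up -d` is the single deployment command.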
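To illustrate what "exposing APIs as tools" means at the protocol level: MCP servers advertise tools to clients as JSON descriptors (name, description, and a JSON Schema for inputs) in the `tools/list` response. The sketch below builds one such descriptor; the tool name and schema are illustrative assumptions, not this project's actual tool set.

```python
import json

# Hypothetical tool descriptor in the shape MCP's tools/list returns.
# "github_create_issue" and its fields are illustrative, not this
# server's real tool definitions.
create_issue_tool = {
    "name": "github_create_issue",
    "description": "Create an issue in a GitHub repository.",
    "inputSchema": {  # JSON Schema describing the tool's arguments
        "type": "object",
        "properties": {
            "repo": {"type": "string", "description": "owner/name"},
            "title": {"type": "string"},
            "body": {"type": "string"},
        },
        "required": ["repo", "title"],
    },
}

# An agent-facing client would receive something like:
print(json.dumps({"tools": [create_issue_tool]}, indent=2))
```

An LLM agent reads the `inputSchema` to decide how to fill in arguments before issuing a `tools/call` request back to the server.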
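The token-bucket limiter can be sketched as follows. This is a minimal in-memory version of the algorithm, assuming the 500 req/min figure is the bucket's refill rate; the actual server keeps the bucket state in Redis so the limit holds across instances.

```python
import time

class TokenBucket:
    """Minimal in-memory token bucket (sketch only; the server stores
    this state in Redis for cross-process rate limiting)."""

    def __init__(self, capacity: int, refill_per_sec: float):
        self.capacity = capacity              # max tokens = burst size
        self.refill_per_sec = refill_per_sec  # steady-state rate
        self.tokens = float(capacity)
        self.last = time.monotonic()

    def allow(self, cost: float = 1.0) -> bool:
        now = time.monotonic()
        # Refill proportionally to elapsed time, capped at capacity.
        self.tokens = min(
            self.capacity,
            self.tokens + (now - self.last) * self.refill_per_sec,
        )
        self.last = now
        if self.tokens >= cost:
            self.tokens -= cost
            return True  # request admitted
        return False     # rate limit exceeded

# 500 req/min works out to a refill rate of ~8.33 tokens/sec.
bucket = TokenBucket(capacity=500, refill_per_sec=500 / 60)
```

In the Redis-backed version, the refill-and-decrement step is typically done atomically (e.g. in a single Lua script) so concurrent requests cannot double-spend tokens.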