One-Click Deployment for Any MCP Server
Written by punkpeye.
- Why We Built This
- Getting Started with Public Servers
- Adding Your Own Server
- Connect From Any Client
- Enterprise-Grade Security
- Complete Observability
- Performance That Doesn't Keep You Waiting
- Our Vision for MCP
- Supporting Open Source (Monetization)
We launched our MCP directory on November 25, 2024, just days after Anthropic's public announcement. That's how we started, and it's what we're best known for. Since then, however, we've evolved into a full hosting platform with one-click deployment for any MCP server.
What makes Glama special? Every server runs in its own isolated VM with dedicated resources, persistent state, end-to-end encryption, and built-in observability.
Why We Built This
When we started Glama, nobody knew where MCP was heading. Would servers need state? Would they run locally or remotely? Instead of guessing, we built a platform that could handle either answer – with persistent state and proper security from day one.
Getting Started with Public Servers
The easiest way to experience Glama is through our public servers. We index the entire MCP ecosystem – all 9,000+ servers. For each one, we build a Docker image, index its tools, prompts, and resources, and scan it for vulnerabilities.
Try it yourself:
- Go to glama.ai/mcp/servers/@punkpeye/mcp-ping
- Click "Deploy Server"
That's it!
The server deploys in seconds and becomes immediately available in your Glama Workspace (just call it with @mcp-ping).
Adding Your Own Server
Ready to deploy your own code? Every server is essentially a Docker container. Go to glama.ai/mcp/servers, click "Add Server", then configure your Dockerfile in Admin. We will run a few checks and deploy the server.
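To give a concrete sense of what such a container runs, here's a minimal sketch of an MCP server written with the FastMCP TypeScript library mentioned later in this post. The tool name, schema, and logic are placeholders rather than a required template – any image that starts an MCP server will do – and your Dockerfile simply needs to install dependencies and run this entry point.

```typescript
// server.ts – a minimal MCP server sketch using the FastMCP TypeScript library.
// The "add" tool is a placeholder; expose whatever tools your server needs.
import { FastMCP } from "fastmcp";
import { z } from "zod";

const server = new FastMCP({
  name: "my-server",
  version: "1.0.0",
});

server.addTool({
  name: "add",
  description: "Add two numbers",
  parameters: z.object({
    a: z.number(),
    b: z.number(),
  }),
  execute: async (args) => String(args.a + args.b),
});

// Start on stdio here; a proxy such as mcp-proxy (mentioned below) can expose
// a stdio server over SSE or Streamable HTTP.
server.start({ transportType: "stdio" });
```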
Connect From Any Client
Once deployed, your servers work everywhere. Every server is accessible via SSE or Streamable HTTP, so you can use it with any MCP-capable client – VS Code, Cursor, and others.
Just grab your URL from glama.ai/settings/mcp/servers (it includes your private access token) and configure your client.
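If you'd rather connect programmatically than through an editor, the sketch below uses the official MCP TypeScript SDK's client over SSE. The URL is a placeholder – paste the exact URL (with its embedded token) from your settings page – and the package and class names come from the SDK, not from Glama.

```typescript
// connect.ts – a sketch of connecting to a deployed server with the official
// MCP TypeScript SDK. The URL below is a placeholder: use the one (including
// its access token) shown at glama.ai/settings/mcp/servers.
import { Client } from "@modelcontextprotocol/sdk/client/index.js";
import { SSEClientTransport } from "@modelcontextprotocol/sdk/client/sse.js";

const serverUrl = new URL("https://<your-server-url-from-settings>");

const client = new Client({ name: "example-client", version: "1.0.0" });

// Open the SSE transport and run the MCP initialization handshake.
await client.connect(new SSEClientTransport(serverUrl));

// List the tools the deployed server exposes, then disconnect.
const { tools } = await client.listTools();
console.log(tools.map((tool) => tool.name));

await client.close();
```

The same SDK also ships a Streamable HTTP client transport if you prefer that over SSE.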
Enterprise-Grade Security
Security isn't an afterthought – it's fundamental to our architecture. MCP servers can run arbitrary code, so we use Firecracker microVMs to completely isolate each instance. Every server gets its own dedicated environment with no shared resources. Hardware-level virtualization is a far stronger boundary than container isolation alone, and it's why enterprises trust us.
Complete Observability
Since we host every MCP server, you get complete visibility into what's happening. Access OS-level logs and application event traces for every deployment. This visibility is crucial when debugging why an MCP server misbehaves or understanding exactly what data flows through it.
Performance That Doesn't Keep You Waiting
Who likes waiting for their MCP servers to boot? Nobody. That's why every deployment gets:
- Dedicated CPU and memory
- High IOPS storage
- Sub-second boot times
Coming soon: Multi-CPU support, GPU access, and persistent volumes.
Our Vision for MCP
We want Glama to be the most trusted MCP hosting platform for individuals and enterprises alike. Today, over 50,000 companies and individuals use our servers to experiment with MCP and to run production workloads. We see the strongest adoption from users working with coding agents and automation platforms within the Glama workspace. We will continue to invest in the MCP ecosystem, and in particular in:
- Observability
- Monetization for Open-Source maintainers
- Security (e.g. better secret management)
- Open-Source frameworks and libraries like FastMCP and mcp-proxy
Supporting Open Source (Monetization)
We're big open-source proponents, but we also believe maintainers deserve support. We're actively building monetization tools for server authors to sustain their projects.
Interested in our preview program? Email frank@glama.ai.
Ready to deploy your first MCP server? Visit glama.ai/mcp/servers.
Written by punkpeye (@punkpeye)