Kernel MCP Server
A Model Context Protocol (MCP) server that provides AI assistants with secure access to Kernel platform tools and browser automation capabilities.
🌐 Use instantly at https://mcp.onkernel.com/mcp — no installation required!
What is this?
The Kernel MCP Server bridges AI assistants (like Claude, Cursor, or other MCP-compatible tools) with the Kernel platform, enabling them to:
- 🚀 Deploy and manage Kernel apps in the cloud
- 🌐 Launch and control headless Chromium sessions for web automation
- 📊 Monitor deployments and track invocations
- 🔍 Search Kernel documentation and inject context
- 💻 Evaluate JavaScript and stream DOM snapshots
Open-source & fully-managed — the complete codebase is available here, and we run the production instance so you don't need to deploy anything.
The server uses OAuth 2.0 authentication via Clerk to ensure secure access to your Kernel resources.
🎯 First Time? Start Here!
Ready to try Kernel but don't see any apps yet? Perfect! Here's how to get started:
Step 1: Install Kernel MCP Server
Install the Kernel MCP server to your favorite MCP client using the setup instructions below.
Step 2: Ask Your AI Assistant for Help
Once connected, simply ask in your MCP client chat:
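(An illustrative prompt; the original example isn't reproduced here, so phrase it however you like.)

```
I'm new to Kernel. Search the docs for the quickstart and walk me through creating my first app.
```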
Your AI assistant will use the `search_docs` tool to get you the latest quickstart instructions and guide you through setting up your first Kernel app!
Step 3: Deploy & Test with MCP Tools
After you have a sample app locally, ask your assistant:
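(Illustrative wording only.)

```
Deploy the sample app in this directory to Kernel with deploy_app.
```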
Note: Be patient and wait until all tool parameters are fully generated before running the tool call.
Then test it:
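(Again, illustrative wording only.)

```
Invoke the sample app's action with invoke_action and show me the result.
```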
Why This Approach?
- ✅ Always up-to-date - Your AI assistant fetches the latest docs
- ✅ Interactive guidance - Get help customized to your setup
- ✅ Learn MCP tools - Experience the power of `search_docs`, `deploy_app`, and `invoke_action`
- ✅ End-to-end workflow - From local development to cloud deployment to execution
What You'll Experience
Your AI assistant will help you:
- Download and understand sample apps (`search_docs`)
- Deploy your local code to the cloud (`deploy_app`)
- Run actions and see results (`invoke_action`)
- Create browser sessions in the cloud (`create_browser`)
- Monitor deployments (`list_deployments`, `get_deployment`)
🚀 MCP Server Setup
First, add the Kernel MCP server to your favorite MCP-compatible client using https://mcp.onkernel.com/mcp. Here are setup instructions for popular clients:
Claude
Team & Enterprise (Claude.ai)
- Navigate to Settings in the sidebar (web or desktop).
- Scroll to Integrations and click Add more.
- Fill in:
  - Integration name: `Kernel`
  - Integration URL: `https://mcp.onkernel.com/mcp`
- Start a chat, enable Tools, and finish auth.
Free & Pro (Claude desktop)
Open `~/Library/Application Support/Claude/claude_desktop_config.json` and add:
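The config snippet itself isn't reproduced here; a minimal entry, assuming the `npx -y mcp-remote` command listed under Others below, looks like this:

```json
{
  "mcpServers": {
    "kernel": {
      "command": "npx",
      "args": ["-y", "mcp-remote", "https://mcp.onkernel.com/mcp"]
    }
  }
}
```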
Restart the Claude desktop app.
Claude Code CLI
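The original command isn't shown here; assuming Claude Code's `claude mcp add` subcommand and the same `npx -y mcp-remote` command listed under Others below, registration would look roughly like:

```bash
# "kernel" is an arbitrary local name for the server entry
claude mcp add kernel -- npx -y mcp-remote https://mcp.onkernel.com/mcp
```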
Cursor
Manual Setup
- Press ⌘/Ctrl Shift J to open settings.
- Click Tools & Integrations.
- Click New MCP server.
- Add the server configuration (see the sketch after this list).
- Save and the server will be available.
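Cursor's exact MCP config format can vary by version; assuming it accepts the common `mcpServers` JSON shape with the `npx -y mcp-remote` command listed under Others below, a sketch:

```json
{
  "mcpServers": {
    "kernel": {
      "command": "npx",
      "args": ["-y", "mcp-remote", "https://mcp.onkernel.com/mcp"]
    }
  }
}
```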
Goose
Goose Desktop
- Click `...` in the top right corner of the Goose Desktop.
- Select `Advanced Settings` from the menu.
- Under `Extensions`, click `Add custom extension`.
- On the `Add custom extension` modal, enter:
  - Type: `Streaming HTTP`
  - ID: `kernel`
  - Name: `Kernel`
  - Description: `Access Kernel's cloud-based browsers via MCP`
  - URL: `https://mcp.onkernel.com/mcp`
  - Timeout: `300`
- Click the `Add` button.
Visual Studio Code
- Press ⌘/Ctrl Shift P → search MCP: Add Server.
- Select Command (stdio).
- Enter: `npx -y mcp-remote https://mcp.onkernel.com/mcp`
- Name the server Kernel and press Enter.
- Activate via MCP: List Servers → Kernel → Start Server.
Windsurf
- Press ⌘/Ctrl , to open settings.
- Navigate to Cascade → MCP servers → Add custom server.
- Paste:
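The JSON to paste isn't reproduced here; assuming Windsurf accepts the common `mcpServers` shape with the `npx -y mcp-remote` command listed under Others below, a sketch:

```json
{
  "mcpServers": {
    "kernel": {
      "command": "npx",
      "args": ["-y", "mcp-remote", "https://mcp.onkernel.com/mcp"]
    }
  }
}
```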
Zed
Open `settings.json` and add:
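The original snippet isn't included; Zed's MCP settings schema has changed across versions, so treat this `context_servers` sketch (again using `npx -y mcp-remote`) as an assumption to adapt:

```json
{
  "context_servers": {
    "kernel": {
      "command": {
        "path": "npx",
        "args": ["-y", "mcp-remote", "https://mcp.onkernel.com/mcp"]
      }
    }
  }
}
```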
Others
Many other MCP-capable tools accept:
- Command: `npx`
- Arguments: `-y mcp-remote https://mcp.onkernel.com/mcp`
- Environment: (none)
Configure these values wherever the tool expects MCP server settings.
🛠️ Available MCP Tools
The server provides these tools for AI assistants:
Application Management
- `deploy_app` - Deploy TypeScript or Python apps to Kernel
- `list_apps` - List apps in your Kernel organization
- `invoke_action` - Execute actions in Kernel apps
- `get_deployment` - Get deployment status and logs
- `list_deployments` - List all deployments
- `get_invocation` - Get action invocation details
Browser Automation
- `create_browser` - Launch a new browser session
- `get_browser` - Get browser session information
- `delete_browser` - Terminate a browser session
- `list_browsers` - List active browser sessions
Documentation & Search
- `search_docs` - Search Kernel platform documentation and guides
📚 Usage Examples
Deploy Local Apps to the Cloud
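The original example isn't included here; an illustrative prompt to your assistant (the app path is hypothetical) might be:

```
Deploy the TypeScript app in ./my-app to Kernel with deploy_app.
```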
Invoke Apps from Anywhere
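Illustrative prompt (the app and action names are placeholders):

```
Use invoke_action to run my-app's main action, then check its status with get_invocation.
```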
Create Persistent Browser Sessions
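Illustrative prompt:

```
Create a browser session with create_browser, keep it running, and list my active sessions with list_browsers.
```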
❓ Frequently Asked Questions
Is the server open source?
Yes — the code lives at github.com/onkernel/kernel-mcp-server. You're welcome to browse the code and contribute. We provide a hosted instance at https://mcp.onkernel.com/mcp for convenience.
Does Kernel store my data?
Only encrypted refresh tokens and minimal metadata required for auth; browser state lives in your Kernel organization and never leaves your tenancy.
What if the handshake fails?
Restart your MCP client or disable/re-enable the Kernel server before opening a support ticket. Most connection issues resolve with a simple restart.
🤝 Contributing
We welcome contributions! Please see our contributing guidelines:
- Fork the repository and create your feature branch
- Make your changes and add tests if applicable
- Run the linter and formatter (example commands after this list)
- Test your changes thoroughly
- Submit a pull request with a clear description
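For step 3, the exact scripts aren't specified here; assuming conventional npm script names (check the repository's package.json for the real ones):

```bash
# Assumed script names, not confirmed by the repo
npm run lint
npm run format
```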
Development Guidelines
- Follow the existing code style and formatting
- Add TypeScript types for new functions and components
- Update documentation for any API changes
- Ensure all tests pass before submitting
📄 License
This project is licensed under the MIT License - see the LICENSE file for details.
🔗 Related Projects
- Model Context Protocol - The protocol specification
- Kernel Platform - The platform this server integrates with
- Clerk - Authentication provider
- @onkernel/sdk - Kernel JavaScript SDK
💬 Support
- Issues & Bugs: GitHub Issues
- MCP Feedback: github.com/kernelxyz/mcp-feedback
- Documentation: Kernel Docs • MCP Setup Guide
- Community: Kernel Discord
Built with ❤️ by the Kernel Team