NCP - Natural Context Provider

by portel-dev

1 MCP to rule them all

Your MCPs, supercharged. Find any tool instantly, execute with code mode, run on schedule, discover skills, load Photons, ready for any client. Smart loading saves tokens and energy.

๐Ÿ’ What is NCP?

Instead of your AI juggling 50+ tools scattered across different MCPs, NCP gives it a single, unified interface with code mode execution, scheduling, skills discovery, and custom Photons.

Your AI sees just 2-3 simple tools:

  • find - Search for any tool, skill, or Photon: "I need to read a file" → finds the right tool automatically

  • code - Execute TypeScript directly: `await github.create_issue({...})` (code mode, enabled by default)

  • run - Execute tools individually (when code mode is disabled)

Behind the scenes, NCP manages all 50+ tools + skills + Photons: routing requests, discovering the right capability, executing code, scheduling tasks, managing health, and caching responses.
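To make the code tool concrete, here is a sketch of the kind of multi-step snippet an AI might submit in code mode. The `filesystem` and `github` namespaces are illustrative stand-ins for whatever tools your MCPs expose, not guaranteed NCP bindings; they are stubbed below so the sketch runs on its own:

```typescript
// Hypothetical code-mode workflow: read a TODO file, open a GitHub issue for
// each unchecked item. The tool namespaces are stubs, not real NCP bindings.
const filesystem = {
  read_file: async (_args: { path: string }) => "- [ ] ship v1\n- [x] write docs",
};
const github = {
  create_issue: async (issue: { title: string }) => {
    console.log("created issue:", issue.title);
  },
};

const todo = await filesystem.read_file({ path: "TODO.md" });
const open = todo.split("\n").filter((line) => line.startsWith("- [ ]"));
for (const line of open) {
  await github.create_issue({ title: line.slice("- [ ] ".length) });
}
```

In a real session the namespaces come from your configured MCPs, and NCP executes the snippet on its side instead of shipping dozens of tool schemas to the model.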

*(diagram: NCP Transformation Flow)*

Why this matters:

  • Your AI stops analyzing "which tool do I use?" and starts doing actual work

  • Code mode lets AI write multi-step TypeScript workflows combining tools, skills, and scheduling

  • Skills provide domain expertise: canvas design, PDF manipulation, document generation, more

  • Photons enable custom TypeScript MCPs without npm publishing

  • 97% fewer tokens burned on tool confusion (2,500 vs 103,000 for 80 tools)

  • 5x faster responses (sub-second tool selection vs 5-8 seconds)

  • Your AI becomes focused. Not desperate.

🚀 NEW: Project-level configuration - each project can define its own MCPs automatically

What's MCP? The Model Context Protocol by Anthropic lets AI assistants connect to external tools and data sources. Think of MCPs as "plugins" that give your AI superpowers like file access, web search, databases, and more.


😤 The MCP Paradox: From Assistant to Desperate

You gave your AI assistant 50 tools to be more capable. Instead, you got desperation:

  • Paralyzed by choice ("Should I use read_file or get_file_content?")

  • Exhausted before starting ("I've spent my context limit analyzing which tool to use")

  • Costs explode (50+ tool schemas burn tokens before any real work happens)

  • Asks instead of acts (used to be decisive, now constantly asks for clarification)


🧸 Why Too Many Tools Break the System

Think about it like this:

A child with one toy → treasures it, masters it, creates endless games with it.
A child with 50 toys → can't hold them all, gets overwhelmed, stops playing entirely.

Your AI is that child. MCPs are the toys. More isn't always better.

The most creative people thrive with constraints, not infinite options. A poet given "write about anything" faces writer's block. Given "write a haiku about rain"? Instant inspiration.

Your AI is the same. Give it one perfect tool → instant action. Give it 50 tools → cognitive overload. NCP provides just-in-time tool discovery so your AI gets exactly what it needs, when it needs it.


📊 The Before & After Reality

Before NCP: Desperate Assistant 😵‍💫

When your AI assistant manages 50 tools directly:

```
🤖 AI Assistant Context:
├── Filesystem MCP (12 tools) ─ 15,000 tokens
├── Database MCP (8 tools) ─── 12,000 tokens
├── Web Search MCP (6 tools) ── 8,000 tokens
├── Email MCP (15 tools) ───── 18,000 tokens
├── Shell MCP (10 tools) ───── 14,000 tokens
├── GitHub MCP (20 tools) ──── 25,000 tokens
└── Slack MCP (9 tools) ────── 11,000 tokens

💀 Total: 80 tools = 103,000 tokens of schemas
```

What happens:

  • AI burns 50%+ of context just understanding what tools exist

  • Spends 5-8 seconds analyzing which tool to use

  • Often picks wrong tool due to schema confusion

  • Hits context limits mid-conversation

After NCP: Executive Assistant ✨

With NCP as Chief of Staff:

```
🤖 AI Assistant Context:
└── NCP (2 unified tools) ──── 2,500 tokens

🎯 Behind the scenes: NCP manages all 80 tools
📈 Context saved: 100,500 tokens (97% reduction!)
⚡ Decision time: Sub-second tool selection
🎪 AI behavior: Confident, focused, decisive
```

Real results from our testing:

| Your MCP Setup | Without NCP | With NCP | Token Savings |
| --- | --- | --- | --- |
| Small (5 MCPs, 25 tools) | 15,000 tokens | 8,000 tokens | 47% saved |
| Medium (15 MCPs, 75 tools) | 45,000 tokens | 12,000 tokens | 73% saved |
| Large (30 MCPs, 150 tools) | 90,000 tokens | 15,000 tokens | 83% saved |
| Enterprise (50+ MCPs, 250+ tools) | 150,000 tokens | 20,000 tokens | 87% saved |

Translation:

  • 5x faster responses (8 seconds → 1.5 seconds)

  • 12x longer conversations before hitting limits

  • 90% reduction in wrong tool selection

  • Zero context exhaustion in typical sessions
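The savings percentages in the table above follow directly from the token counts; a quick sanity check:

```typescript
// Recompute the "Token Savings" column from the Without/With NCP columns.
const setups: [string, number, number][] = [
  ["Small", 15_000, 8_000],
  ["Medium", 45_000, 12_000],
  ["Large", 90_000, 15_000],
  ["Enterprise", 150_000, 20_000],
];
for (const [name, without, withNcp] of setups) {
  const saved = Math.round(100 * (1 - withNcp / without));
  console.log(`${name}: ${saved}% saved`);
}
// → Small: 47% saved, Medium: 73% saved, Large: 83% saved, Enterprise: 87% saved
```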


📋 Prerequisites

  • Node.js 18+ (Download here)

  • npm (included with Node.js) or npx for running packages

  • Command line access (Terminal on Mac/Linux, Command Prompt/PowerShell on Windows)

🚀 Installation

Choose your MCP client for setup instructions:

| Client | Description | Setup Guide |
| --- | --- | --- |
| Claude Desktop | Anthropic's official desktop app. Best for NCP - one-click .dxt install with auto-sync | → Full Guide |
| Claude Code | Terminal-first AI workflow. Works out of the box! | Built-in support |
| VS Code | GitHub Copilot with Agent Mode. Use NCP for semantic tool discovery | → Setup |
| Cursor | AI-first code editor with Composer. Popular VS Code alternative | → Setup |
| Windsurf | Codeium's AI-native IDE with Cascade. Built on VS Code | → Setup |
| Cline | VS Code extension for AI-assisted development with MCP support | → Setup |
| Continue | VS Code AI assistant with Agent Mode and local LLM support | → Setup |

Want more clients? See the full list of MCP-compatible clients and tools: Official MCP Clients • Awesome MCP. Any other MCP-compatible client works via npm (see Quick Start below).


Quick Start (npm)

For advanced users or MCP clients not listed above:

Step 1: Install NCP

```bash
npm install -g @portel/ncp
```

Step 2: Import existing MCPs (optional)

```bash
ncp config import   # Paste your config JSON when prompted
```

Step 3: Configure your MCP client

Add to your client's MCP configuration:

```json
{
  "mcpServers": {
    "ncp": {
      "command": "ncp"
    }
  }
}
```

✅ Done! Your AI now sees just 2 tools instead of 50+.

*(screenshot: NCP List Overview)*


🧪 Test Drive: See the Difference Yourself

Want to experience what your AI experiences? NCP has a human-friendly CLI:

๐Ÿ” Smart Discovery

```bash
# Ask like your AI would ask:
ncp find "I need to read a file"
ncp find "help me send an email"
ncp find "search for something online"
```

*(screenshot: NCP Find Command)*

Notice: NCP understands intent, not just keywords - exactly what your AI needs.

📋 Ecosystem Overview

```bash
# See your complete MCP ecosystem:
ncp list --depth 2

# Get help anytime:
ncp --help
```

*(screenshot: NCP Help Command)*

⚡ Direct Testing

```bash
# Test any tool safely:
ncp run filesystem read_file --path "/tmp/test.txt"
```

Why this matters: You can debug and test tools directly, just like your AI would use them.

✅ Verify Everything Works

```bash
# 1. Check NCP is installed correctly
ncp --version

# 2. Confirm your MCPs are imported
ncp list

# 3. Test tool discovery
ncp find "file"

# 4. Test a simple tool (if you have filesystem MCP)
ncp run filesystem read_file --path "/tmp/test.txt" --dry-run
```

✅ Success indicators:

  • NCP shows version number

  • ncp list shows your imported MCPs

  • ncp find returns relevant tools

  • Your AI client shows only NCP in its tool list


💪 From Tools to Automation: The Real Power

You've seen find (discover tools) and code (execute TypeScript). Individually, they're useful. Together with scheduling, they become an automation powerhouse.

A Real Example: The MCP Conference Scraper

We wanted to stay on top of MCP-related conferences and workshops for an upcoming release. Instead of manually checking websites daily, we asked Claude:

"Set up a daily scraper that finds MCP conferences and saves them to a CSV file"

What Claude did:

  1. Used code to write the scraper:

```typescript
// Search the web for MCP conferences
const results = await web.search({ query: "Model Context Protocol conference 2025" });

// Read each result and extract details
for (const url of results) {
  const content = await web.read({ url });
  // Extract title, deadline, description...
  // Save to ~/.ncp/mcp-conferences.csv
}
```

  2. Used schedule to make it recurring:

```bash
ncp schedule create code:run "every day at 9am" \
  --name "MCP Conference Scraper" \
  --catchup-missed
```

How to set this up yourself:

First, install the web photon (provides search and read capabilities):

```bash
# Install from the official photons repo
ncp photon add https://raw.githubusercontent.com/portel-dev/photons/main/web.photon.ts
```

Then ask Claude to create the scraper - it will use the web photon automatically.

What happens now:

  • Every morning at 9am, the scraper runs automatically

  • Searches for new MCP events and adds them to the CSV

  • If our laptop was closed at 9am, it catches up when we open it

  • We wake up to fresh conference data - no manual work

The insight: find and code let AI write automation. schedule makes it run forever. That's the powerhouse.


💡 Why NCP Transforms Your AI Experience

🧠 From Desperation to Delegation

  • Desperate Assistant: "I see 50 tools... which should I use... let me think..."

  • Executive Assistant: "I need file access. Done." (NCP handles the details)

💰 Massive Token Savings

  • Before: 100k+ tokens burned on tool confusion

  • After: 2.5k tokens for focused execution

  • Result: 40x token efficiency = 40x longer conversations

🎯 Eliminates Choice Paralysis

  • Desperate: AI freezes, picks wrong tool, asks for clarification

  • Executive: NCP's Chief of Staff finds the RIGHT tool instantly

🚀 Confident Action

  • Before: 8-second delays, hesitation, "Which tool should I use?"

  • After: Instant decisions, immediate execution, zero doubt

Bottom line: Your AI goes from desperate assistant to executive assistant.


⚡ Supercharged Features

Here's exactly how NCP empowers your MCPs:

| Feature | What It Does | Why It Matters |
| --- | --- | --- |
| 🔍 Instant Tool Discovery | Semantic search understands intent ("read a file"), not just keywords | Your AI finds the RIGHT tool in <1s instead of analyzing 50 schemas |
| 📦 On-Demand Loading | MCPs and tools load only when needed, not at startup | Saves 97% of context tokens - AI starts working immediately |
| ⏰ Automated Scheduling | Run any tool on cron schedules or natural language times | Background automation without keeping AI sessions open |
| 🔌 Universal Compatibility | Works with Claude Desktop, Claude Code, Cursor, VS Code, and any MCP client | One configuration for all your AI tools - no vendor lock-in |
| 💾 Smart Caching | Intelligent caching of tool schemas and responses | Eliminates redundant indexing - energy efficient and fast |
The result: Your MCPs go from scattered tools to a unified, intelligent system that your AI can actually use effectively.


๐Ÿ› ๏ธ For Power Users: Manual Setup

Prefer to build from scratch? Add MCPs manually:

```bash
# Add the most popular MCPs:

# AI reasoning and memory
ncp add sequential-thinking npx @modelcontextprotocol/server-sequential-thinking
ncp add memory npx @modelcontextprotocol/server-memory

# File and development tools
ncp add filesystem npx @modelcontextprotocol/server-filesystem ~/Documents  # Path: directory to access
ncp add github npx @modelcontextprotocol/server-github                      # No path needed

# Search and productivity
ncp add brave-search npx @modelcontextprotocol/server-brave-search          # No path needed
```

*(screenshot: NCP Add Command)*

💡 Pro tip: Browse Smithery.ai (2,200+ MCPs) or mcp.so to discover tools for your specific needs.


🎯 Popular MCPs That Work Great with NCP

🔥 Most Downloaded

```bash
# Community favorites (download counts from Smithery.ai):
ncp add sequential-thinking npx @modelcontextprotocol/server-sequential-thinking   # 5,550+ downloads
ncp add memory npx @modelcontextprotocol/server-memory                             # 4,200+ downloads
ncp add brave-search npx @modelcontextprotocol/server-brave-search                 # 680+ downloads
```

๐Ÿ› ๏ธ Development Essentials

```bash
# Popular dev tools:
ncp add filesystem npx @modelcontextprotocol/server-filesystem ~/code
ncp add github npx @modelcontextprotocol/server-github
ncp add shell npx @modelcontextprotocol/server-shell
```

๐ŸŒ Productivity & Integrations

```bash
# Enterprise favorites:
ncp add gmail npx @mcptools/gmail-mcp
ncp add slack npx @modelcontextprotocol/server-slack
ncp add google-drive npx @modelcontextprotocol/server-gdrive
ncp add postgres npx @modelcontextprotocol/server-postgres
ncp add puppeteer npx @hisma/server-puppeteer
```

🤖 Internal MCPs

NCP includes powerful internal MCPs that extend functionality beyond external tool orchestration:

Scheduler MCP - Automate Any Tool

Schedule any MCP tool to run automatically using cron or natural language schedules.

```bash
# Schedule a daily backup check
ncp run schedule:create --params '{
  "name": "Daily Backup",
  "schedule": "every day at 2am",
  "tool": "filesystem:list_directory",
  "parameters": {"path": "/backups"}
}'
```

Features:

  • ✅ Natural language schedules ("every day at 9am", "every monday")

  • ✅ Standard cron expressions for advanced control

  • ✅ Automatic validation before scheduling

  • ✅ Execution history and monitoring

  • ✅ Works even when NCP is not running (system cron integration)
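For reference, a schedule like "every day at 2am" corresponds to the cron expression `0 2 * * *`. The sketch below shows how the five cron fields are read; it is illustrative only, not NCP's actual parser, and supports just `*` and plain numbers:

```typescript
// Five-field cron: minute hour day-of-month month day-of-week.
function cronMatches(expr: string, date: Date): boolean {
  const [min, hour, dom, mon, dow] = expr.trim().split(/\s+/);
  const checks: [string, number][] = [
    [min, date.getMinutes()],
    [hour, date.getHours()],
    [dom, date.getDate()],
    [mon, date.getMonth() + 1], // JS months are 0-based; cron months are 1-12
    [dow, date.getDay()],       // 0 = Sunday in both
  ];
  return checks.every(([field, value]) => field === "*" || Number(field) === value);
}

cronMatches("0 2 * * *", new Date(2025, 0, 15, 2, 0)); // → true (02:00 on any day)
cronMatches("0 2 * * *", new Date(2025, 0, 15, 3, 0)); // → false
```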

→ Full Scheduler Guide

MCP Management MCP - Install MCPs from AI

Install and configure MCPs dynamically through natural language.

```bash
# AI can discover and install MCPs for you
ncp find "install mcp"
# Shows: mcp:add, mcp:remove, mcp:list
```

Features:

  • ✅ Search and discover MCPs from registries

  • ✅ Install MCPs without manual configuration

  • ✅ Update and remove MCPs programmatically

  • ✅ AI can self-extend with new capabilities

Skills Management MCP - Extend Claude with Plugins

Manage Anthropic Agent Skills - modular extensions that add specialized knowledge and tools to Claude.

```typescript
// Discover skills using vector search
const results = await skills.find({ query: "canvas design" });

// Install a skill
await skills.add({ skill_name: "canvas-design" });

// List installed skills
const installed = await skills.list();

// Read skill resources
const template = await skills.read_resource({
  skill_name: "canvas-design",
  file_path: "resources/templates.md"
});
```

Features:

  • ✅ Vector-powered semantic search for skills

  • ✅ One-command install from official marketplace

  • ✅ Progressive disclosure (metadata → full content → resources)

  • ✅ Official Anthropic marketplace integration

  • ✅ Custom marketplace support

  • ✅ Auto-loading of installed skills

→ Full Skills Guide

Analytics MCP - Visualize Usage & Performance

View usage statistics, token savings, and performance metrics directly in your chat.

```bash
# View usage overview with ASCII charts
ncp run analytics:overview --params '{"period": 7}'
```

Features:

  • ✅ Usage trends and most used tools

  • ✅ Token savings analysis (Code-Mode efficiency)

  • ✅ Performance metrics (response times, error rates)

  • ✅ ASCII-formatted charts for AI consumption

Configuration: Internal MCPs are disabled by default. Enable in your profile settings:

```json
{
  "settings": {
    "enable_schedule_mcp": true,
    "enable_mcp_management": true,
    "enable_skills": true,
    "enable_analytics_mcp": true
  }
}
```

🔧 Advanced Features

Smart Health Monitoring

NCP automatically detects broken MCPs and routes around them:

```bash
ncp list --depth 1    # See health status
ncp config validate   # Check configuration health
```

🎯 Result: Your AI never gets stuck on broken tools.

Multi-Profile Organization

Organize MCPs by project or environment:

```bash
# Development setup
ncp add --profile dev filesystem npx @modelcontextprotocol/server-filesystem ~/dev

# Production setup
ncp add --profile prod database npx production-db-server

# Use specific profile
ncp --profile dev find "file tools"
```

🚀 Project-Level Configuration

New: Configure MCPs per project with automatic detection - perfect for teams and Cloud IDEs:

```bash
# In any project directory, create local MCP configuration:
mkdir .ncp
ncp add filesystem npx @modelcontextprotocol/server-filesystem ./
ncp add github npx @modelcontextprotocol/server-github

# NCP automatically detects and uses project-local configuration
ncp find "save file"   # Uses only project MCPs
```

How it works:

  • ๐Ÿ“ Local โ†’ Uses project configuration

  • ๐Ÿ  No local โ†’ Falls back to global ~/.ncp

  • ๐ŸŽฏ Zero profile management needed โ†’ Everything goes to default all.json
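The lookup order above can be sketched as follows. This is illustrative only: the walk-up-parents behavior is an assumption about how such detection typically works, not a statement about NCP's internals:

```typescript
import * as fs from "node:fs";
import * as os from "node:os";
import * as path from "node:path";

// Prefer a project-local .ncp directory; otherwise fall back to ~/.ncp.
function resolveNcpDir(cwd: string): string {
  let dir = path.resolve(cwd);
  for (;;) {
    const candidate = path.join(dir, ".ncp");
    if (fs.existsSync(candidate) && fs.statSync(candidate).isDirectory()) {
      return candidate;
    }
    const parent = path.dirname(dir);
    if (parent === dir) break; // reached the filesystem root
    dir = parent;
  }
  return path.join(os.homedir(), ".ncp");
}
```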

Perfect for:

  • 🤖 Claude Code projects (project-specific MCP tooling)

  • 👥 Team consistency (ship .ncp folder with your repo)

  • 🔧 Project-specific tooling (each project defines its own MCPs)

  • 📦 Environment isolation (no global MCP conflicts)

```
# Example project structures:
frontend-app/
  .ncp/profiles/all.json   # → playwright, lighthouse, browser-context
  src/

api-backend/
  .ncp/profiles/all.json   # → postgres, redis, docker, kubernetes
  server/
```

HTTP/SSE Transport & Hibernation Support

NCP supports both stdio (local) and HTTP/SSE (remote) MCP servers:

Stdio Transport (Traditional):

```bash
# Local MCP servers running as processes
ncp add filesystem npx @modelcontextprotocol/server-filesystem ~/Documents
```

HTTP/SSE Transport (Remote):

```json
{
  "mcpServers": {
    "remote-mcp": {
      "url": "https://mcp.example.com/api",
      "auth": {
        "type": "bearer",
        "token": "your-token-here"
      }
    }
  }
}
```

🔋 Hibernation-Enabled Servers:

NCP automatically supports hibernation-enabled MCP servers (like Cloudflare Durable Objects or Metorial):

  • Zero configuration needed - Hibernation works transparently

  • Automatic wake-up - Server wakes on demand when NCP makes requests

  • State preservation - Server state is maintained across hibernation cycles

  • Cost savings - Only pay when MCPs are actively processing requests

How it works:

  1. Server hibernates when idle (consumes zero resources)

  2. NCP sends a request → Server wakes instantly

  3. Server processes request and responds

  4. Server returns to hibernation after idle timeout

Perfect for:

  • 💰 Cost optimization - Only pay for active processing time

  • 🌐 Cloud-hosted MCPs - Metorial, Cloudflare Workers, serverless platforms

  • ♻️ Resource efficiency - No idle server costs

  • 🚀 Scale to zero - Servers automatically sleep when not needed

Note: Hibernation is a server-side feature. NCP's standard HTTP/SSE client automatically works with both traditional and hibernation-enabled servers without any special configuration.

Photon Runtime (CLI vs DXT)

The TypeScript Photon runtime is enabled by default, but the toggle lives in different places depending on how you run NCP:

  • CLI / npm installs: Edit `~/.ncp/settings.json` (or run `ncp config`) and set `enablePhotonRuntime` to `true` or `false`. You can also override ad-hoc with `NCP_ENABLE_PHOTON_RUNTIME=true ncp find "photon"`.

  • DXT / client bundles (Claude Desktop, Cursor, etc.): These builds ignore ~/.ncp/settings.json. Configure photons by setting the env var inside the client config:

```json
{
  "mcpServers": {
    "ncp": {
      "command": "ncp",
      "env": {
        "NCP_ENABLE_PHOTON_RUNTIME": "true"
      }
    }
  }
}
```

If you disable the photon runtime, internal MCPs continue to work, but .photon.ts files are ignored until you re-enable the flag.
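For CLI installs, the corresponding settings-file entry might look like the fragment below. The top-level placement of the key is an assumption based on the flag name mentioned above; check your generated `~/.ncp/settings.json` for the actual structure:

```json
{
  "enablePhotonRuntime": false
}
```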

Import from Anywhere

```bash
# From clipboard (any JSON config)
ncp config import

# From specific file
ncp config import "~/my-mcp-config.json"

# From Claude Desktop (auto-detected paths)
ncp config import
```

🛟 Troubleshooting

Import Issues

```bash
# Check what was imported
ncp list

# Validate health of imported MCPs
ncp config validate

# See detailed import logs
DEBUG=ncp:* ncp config import
```

AI Not Using Tools

  • Check connection: ncp list (should show your MCPs)

  • Test discovery: ncp find "your query"

  • Validate config: Ensure your AI client points to ncp command

Performance Issues

```bash
# Check MCP health (unhealthy MCPs slow everything down)
ncp list --depth 1

# Clear cache if needed
rm -rf ~/.ncp/cache

# Monitor with debug logs
DEBUG=ncp:* ncp find "test"
```

🌓 Why We Built This

Like Yin and Yang, everything relies on the balance of things.

Compute gives us precision and certainty. AI gives us creativity and probability.

We believe breakthrough products emerge when you combine these forces in the right ratio.

How NCP embodies this balance:

| What NCP Does | AI (Creativity) | Compute (Precision) | The Balance |
| --- | --- | --- | --- |
| Tool Discovery | Understands "read a file" semantically | Routes to exact tool deterministically | Natural request → Precise execution |
| Orchestration | Flexible to your intent | Reliable tool execution | Natural flow → Certain outcomes |
| Health Monitoring | Adapts to patterns | Monitors connections, auto-failover | Smart adaptation → Reliable uptime |
Neither pure AI (too unpredictable) nor pure compute (too rigid).

Your AI stays creative. NCP handles the precision.


📚 Deep Dive: How It Works

Want the technical details? Token analysis, architecture diagrams, and performance benchmarks:

📖 Read the Technical Guide →

Learn about:

  • Vector similarity search algorithms

  • N-to-1 orchestration architecture

  • Real-world token usage comparisons

  • Health monitoring and failover systems


๐Ÿค Contributing

Help make NCP even better:


📄 License

Elastic License 2.0 - Full License

TL;DR: Free for all use, including commercial. Cannot be offered as a hosted service to third parties.

If you have feedback or need assistance with the MCP directory API, please join our Discord server