Custom MCP Server πŸ€–

by Meerisha

A Model Context Protocol (MCP) server built with Next.js, providing useful tools and utilities through both HTTP and Server-Sent Events (SSE) transports.

πŸš€ Features

πŸ”§ Available Tools

  • echo - Echo any message back (perfect for testing)

  • get-current-time - Get the current timestamp and ISO date

  • calculate - Perform basic mathematical calculations safely

🌐 Transport Methods

  • HTTP Transport (/mcp) - Stateless HTTP requests (works without Redis)

  • SSE Transport (/sse) - Server-Sent Events with Redis for state management

πŸ”’ Security Features

  • Rate limiting (100 requests per minute)

  • Safe mathematical expression evaluation

  • Input sanitization and validation

πŸƒβ€β™‚οΈ Quick Start

Prerequisites

  • Node.js 18+

  • npm or yarn

  • Docker (optional, for local Redis)

Setup

  1. Clone and install dependencies:

    npm install
  2. Run the automated setup:

    npm run setup

    This will:

    • Create environment configuration

    • Set up Redis (Docker) if available

    • Start the development server automatically

  3. Manual start (alternative):

    npm run dev

The server will be available at http://localhost:3000

πŸ§ͺ Testing

Quick Tests

```shell
# Test HTTP transport
npm run test:http

# Test SSE transport (requires Redis)
npm run test:sse

# Test with Claude Desktop protocol
npm run test:stdio

# Comprehensive tool testing
npm run test:tools
```

Manual Testing

You can test the MCP server manually using curl:

```shell
# List available tools
curl -X POST http://localhost:3000/mcp \
  -H "Content-Type: application/json" \
  -d '{"jsonrpc": "2.0", "id": 1, "method": "tools/list"}'

# Call the echo tool
curl -X POST http://localhost:3000/mcp \
  -H "Content-Type: application/json" \
  -d '{"jsonrpc": "2.0", "id": 2, "method": "tools/call", "params": {"name": "echo", "arguments": {"message": "Hello World!"}}}'

# Calculate an expression
curl -X POST http://localhost:3000/mcp \
  -H "Content-Type: application/json" \
  -d '{"jsonrpc": "2.0", "id": 3, "method": "tools/call", "params": {"name": "calculate", "arguments": {"expression": "15 * 4 + 10"}}}'
```

πŸ”§ Configuration

Environment Variables

Create a .env.local file:

```shell
# Local Redis (Docker)
REDIS_URL=redis://localhost:6379

# Upstash Redis (Production)
UPSTASH_REDIS_REST_URL=your-upstash-url
UPSTASH_REDIS_REST_TOKEN=your-upstash-token
```

Redis Setup

The server automatically detects and uses Redis in this priority order:

  1. Upstash Redis (if UPSTASH_REDIS_REST_URL and UPSTASH_REDIS_REST_TOKEN are set)

  2. Local Redis (if REDIS_URL is set)

  3. No Redis (HTTP transport only)
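The detection order above can be sketched as a small selection function — a hypothetical illustration (`detectRedisMode` is not the server's actual code), using the environment variable names from this README:

```typescript
// Hypothetical sketch of the Redis detection priority described above.
type RedisMode = "upstash" | "local" | "none";

function detectRedisMode(env: Record<string, string | undefined>): RedisMode {
  // 1. Upstash Redis: both REST credentials must be present.
  if (env.UPSTASH_REDIS_REST_URL && env.UPSTASH_REDIS_REST_TOKEN) return "upstash";
  // 2. Local Redis via connection URL.
  if (env.REDIS_URL) return "local";
  // 3. No Redis: HTTP transport only.
  return "none";
}
```

With no Redis variables set, the server still serves the stateless `/mcp` endpoint; only the SSE transport requires a Redis backend.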

Local Redis with Docker

```shell
# The setup script handles this automatically, but you can also run manually:
docker run -d --name redis-mcp -p 6379:6379 redis:alpine
```

Upstash Redis (Production)

  1. Create an Upstash Redis database at upstash.com

  2. Add the connection details to your .env.local

  3. The server will automatically detect and use it

πŸ–₯️ Integration with AI Tools

Claude Desktop

Add to your Claude Desktop configuration (claude_desktop_config.json):

```json
{
  "mcpServers": {
    "custom-mcp": {
      "command": "npx",
      "args": ["-y", "mcp-remote", "http://localhost:3000/mcp"]
    }
  }
}
```

Configuration file locations:

  • macOS: ~/Library/Application Support/Claude/claude_desktop_config.json

  • Windows: %APPDATA%\Claude\claude_desktop_config.json

Cursor IDE

For Cursor 0.48.0 or later (direct SSE support):

```json
{
  "mcpServers": {
    "custom-mcp": {
      "url": "http://localhost:3000/sse"
    }
  }
}
```

For older Cursor versions:

```json
{
  "mcpServers": {
    "custom-mcp": {
      "command": "npx",
      "args": ["-y", "mcp-remote", "http://localhost:3000/mcp"]
    }
  }
}
```

πŸ› οΈ Development

Project Structure

```
custom-mcp-server/
β”œβ”€β”€ app/
β”‚   β”œβ”€β”€ [transport]/
β”‚   β”‚   └── route.ts          # Main MCP server logic
β”‚   β”œβ”€β”€ layout.tsx            # Root layout
β”‚   └── page.tsx              # Home page
β”œβ”€β”€ lib/
β”‚   └── redis.ts              # Redis utilities
β”œβ”€β”€ scripts/
β”‚   β”œβ”€β”€ setup.mjs             # Automated setup
β”‚   β”œβ”€β”€ test-http-client.mjs  # HTTP transport tests
β”‚   β”œβ”€β”€ test-sse-client.mjs   # SSE transport tests
β”‚   └── test-tools.mjs        # Comprehensive tool tests
β”œβ”€β”€ package.json
β”œβ”€β”€ next.config.ts
└── README.md
```

Adding New Tools

  1. Define the tool in app/[transport]/route.ts:

```typescript
const tools = {
  // ... existing tools
  myNewTool: {
    name: "my-new-tool",
    description: "Description of what your tool does",
    inputSchema: {
      type: "object",
      properties: {
        param1: {
          type: "string",
          description: "Description of parameter"
        }
      },
      required: ["param1"]
    }
  }
};
```
  2. Add the handler:

```typescript
const toolHandlers = {
  // ... existing handlers
  "my-new-tool": async ({ param1 }: { param1: string }) => {
    // Your tool logic here
    return {
      content: [
        {
          type: "text",
          text: `Result: ${param1}`
        }
      ]
    };
  }
};
```

Testing Your Changes

```shell
# Run all tests
npm run test:tools

# Test specific functionality
npm run test:http
npm run test:sse
```

πŸ“ API Reference

Tools/List

Get all available tools:

```json
{ "jsonrpc": "2.0", "id": 1, "method": "tools/list" }
```

Tools/Call

Call a specific tool:

```json
{
  "jsonrpc": "2.0",
  "id": 2,
  "method": "tools/call",
  "params": {
    "name": "tool-name",
    "arguments": { "param": "value" }
  }
}
```
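For reference, a successful tools/call response carries the same content shape that the tool handlers return (the text value here is illustrative):

```json
{
  "jsonrpc": "2.0",
  "id": 2,
  "result": {
    "content": [
      { "type": "text", "text": "Result: value" }
    ]
  }
}
```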

πŸš€ Deployment

  1. Deploy to Vercel:

    vercel
  2. Add environment variables in Vercel dashboard:

    • UPSTASH_REDIS_REST_URL

    • UPSTASH_REDIS_REST_TOKEN

  3. Update your AI tool configurations to use the deployed URL:

```
https://your-app.vercel.app/mcp
https://your-app.vercel.app/sse
```

Other Platforms

The server is a standard Next.js application and can be deployed to any platform that supports Node.js:

  • Netlify

  • Railway

  • Render

  • DigitalOcean App Platform

🀝 Contributing

  1. Fork the repository

  2. Create a feature branch: git checkout -b feature/my-new-feature

  3. Make your changes and add tests

  4. Run the test suite: npm run test:tools

  5. Commit your changes: git commit -am 'Add some feature'

  6. Push to the branch: git push origin feature/my-new-feature

  7. Submit a pull request

πŸ“„ License

MIT License - see LICENSE file for details.

πŸ†˜ Troubleshooting

Common Issues

Server not starting:

  • Check if port 3000 is available

  • Ensure all dependencies are installed: npm install

Redis connection issues:

  • Verify Docker is running: docker ps

  • Check Redis container status: docker ps -a | grep redis-mcp

  • Restart Redis: docker restart redis-mcp

AI tool not detecting server:

  • Ensure the server is running and accessible

  • Check the configuration file syntax (valid JSON)

  • Restart your AI tool after configuration changes

  • Verify the server URL is correct

Tool calls failing:

  • Check server logs for error messages

  • Test tools manually with npm run test:tools

  • Verify the tool parameters match the expected schema

Debug Mode

Enable debug logging by setting the environment variable:

```shell
DEBUG=1 npm run dev
```

πŸ“ž Support

  • Create an issue on GitHub for bug reports

  • Check existing issues for common problems

  • Review the test scripts for usage examples
