# Custom MCP Server
A Model Context Protocol (MCP) server built with Next.js, providing useful tools and utilities through both HTTP and Server-Sent Events (SSE) transports.
## Features
### Available Tools
- **echo** - Echo any message back (perfect for testing)
- **get-current-time** - Get the current timestamp and ISO date
- **calculate** - Perform basic mathematical calculations safely
### Transport Methods
- **HTTP Transport** (`/mcp`) - Stateless HTTP requests (works without Redis)
- **SSE Transport** (`/sse`) - Server-Sent Events with Redis for state management
### Security Features
- Rate limiting (100 requests per minute)
- Safe mathematical expression evaluation (see the sketch below)
- Input sanitization and validation
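
The `calculate` tool never passes raw input to `eval`. As a rough illustration of the approach (the function name `safeCalculate` and the exact whitelist are illustrative, not the actual code in `app/[transport]/route.ts`), a minimal sketch looks like this:

```typescript
// Minimal sketch only – the real implementation in app/[transport]/route.ts may differ.
// Reject anything that is not a digit, whitespace, basic operator, or parenthesis,
// then evaluate the remaining arithmetic-only expression.
function safeCalculate(expression: string): number {
  if (!/^[\d\s+\-*/().]+$/.test(expression)) {
    throw new Error("Expression contains unsupported characters");
  }
  // The whitelist above restricts input to plain arithmetic before evaluation.
  return Function(`"use strict"; return (${expression});`)() as number;
}
```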
## Quick Start
### Prerequisites
- Node.js 18+
- npm or yarn
- Docker (optional, for local Redis)
### Setup
1. **Clone and install dependencies:**
```bash
npm install
```
2. **Run the automated setup:**
```bash
npm run setup
```
This will:
- Create environment configuration
- Set up Redis (Docker) if available
- Start the development server automatically
3. **Manual start (alternative):**
```bash
npm run dev
```
The server will be available at `http://localhost:3000`.
## Testing
### Quick Tests
```bash
# Test HTTP transport
npm run test:http
# Test SSE transport (requires Redis)
npm run test:sse
# Test with Claude Desktop protocol
npm run test:stdio
# Comprehensive tool testing
npm run test:tools
```
### Manual Testing
You can test the MCP server manually using curl:
```bash
# List available tools
curl -X POST http://localhost:3000/mcp \
  -H "Content-Type: application/json" \
  -d '{
    "jsonrpc": "2.0",
    "id": 1,
    "method": "tools/list"
  }'

# Call the echo tool
curl -X POST http://localhost:3000/mcp \
  -H "Content-Type: application/json" \
  -d '{
    "jsonrpc": "2.0",
    "id": 2,
    "method": "tools/call",
    "params": {
      "name": "echo",
      "arguments": {
        "message": "Hello World!"
      }
    }
  }'

# Calculate an expression
curl -X POST http://localhost:3000/mcp \
  -H "Content-Type: application/json" \
  -d '{
    "jsonrpc": "2.0",
    "id": 3,
    "method": "tools/call",
    "params": {
      "name": "calculate",
      "arguments": {
        "expression": "15 * 4 + 10"
      }
    }
  }'
```
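
Responses follow the standard JSON-RPC 2.0 envelope, with tool output wrapped in a `content` array. For the echo call above you should get back something along these lines (the exact `text` value depends on how the tool formats its output):

```json
{
  "jsonrpc": "2.0",
  "id": 2,
  "result": {
    "content": [
      {
        "type": "text",
        "text": "Hello World!"
      }
    ]
  }
}
```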
## Configuration
### Environment Variables
Create a `.env.local` file:
```env
# Local Redis (Docker)
REDIS_URL=redis://localhost:6379
# Upstash Redis (Production)
UPSTASH_REDIS_REST_URL=your-upstash-url
UPSTASH_REDIS_REST_TOKEN=your-upstash-token
```
### Redis Setup
The server automatically detects and uses Redis in this priority order (a sketch of the selection logic follows the list):
1. **Upstash Redis** (if `UPSTASH_REDIS_REST_URL` and `UPSTASH_REDIS_REST_TOKEN` are set)
2. **Local Redis** (if `REDIS_URL` is set)
3. **No Redis** (HTTP transport only)
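
A simplified sketch of that selection logic is shown below; the real code lives in `lib/redis.ts`, and the helper name `getRedisConfig` is illustrative:

```typescript
// Illustrative sketch – see lib/redis.ts for the actual implementation.
type RedisConfig =
  | { kind: "upstash"; url: string; token: string }
  | { kind: "local"; url: string }
  | { kind: "none" };

function getRedisConfig(): RedisConfig {
  const { UPSTASH_REDIS_REST_URL, UPSTASH_REDIS_REST_TOKEN, REDIS_URL } = process.env;
  if (UPSTASH_REDIS_REST_URL && UPSTASH_REDIS_REST_TOKEN) {
    return { kind: "upstash", url: UPSTASH_REDIS_REST_URL, token: UPSTASH_REDIS_REST_TOKEN };
  }
  if (REDIS_URL) {
    return { kind: "local", url: REDIS_URL };
  }
  return { kind: "none" }; // SSE transport unavailable; HTTP transport still works
}
```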
#### Local Redis with Docker
```bash
# The setup script handles this automatically, but you can also run it manually:
docker run -d --name redis-mcp -p 6379:6379 redis:alpine
```
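
To verify that the container is accepting connections, ping it with `redis-cli` inside the container:

```bash
docker exec redis-mcp redis-cli ping   # should print PONG
```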
#### Upstash Redis (Recommended for Production)
1. Create an Upstash Redis database at [upstash.com](https://upstash.com)
2. Add the connection details to your `.env.local`
3. The server will automatically detect and use it
## Integration with AI Tools
### Claude Desktop
Add to your Claude Desktop configuration (`claude_desktop_config.json`):
```json
{
  "mcpServers": {
    "custom-mcp": {
      "command": "npx",
      "args": [
        "-y",
        "mcp-remote",
        "http://localhost:3000/mcp"
      ]
    }
  }
}
```
**Configuration file locations:**
- **macOS:** `~/Library/Application Support/Claude/claude_desktop_config.json`
- **Windows:** `%APPDATA%\Claude\claude_desktop_config.json`
### Cursor IDE
For Cursor 0.48.0 or later (direct SSE support):
```json
{
  "mcpServers": {
    "custom-mcp": {
      "url": "http://localhost:3000/sse"
    }
  }
}
```
For older Cursor versions:
```json
{
  "mcpServers": {
    "custom-mcp": {
      "command": "npx",
      "args": [
        "-y",
        "mcp-remote",
        "http://localhost:3000/mcp"
      ]
    }
  }
}
```
## Development
### Project Structure
```
custom-mcp-server/
├── app/
│   ├── [transport]/
│   │   └── route.ts           # Main MCP server logic
│   ├── layout.tsx             # Root layout
│   └── page.tsx               # Home page
├── lib/
│   └── redis.ts               # Redis utilities
├── scripts/
│   ├── setup.mjs              # Automated setup
│   ├── test-http-client.mjs   # HTTP transport tests
│   ├── test-sse-client.mjs    # SSE transport tests
│   └── test-tools.mjs         # Comprehensive tool tests
├── package.json
├── next.config.ts
└── README.md
```
### Adding New Tools
1. **Define the tool** in `app/[transport]/route.ts`:
```typescript
const tools = {
  // ... existing tools
  myNewTool: {
    name: "my-new-tool",
    description: "Description of what your tool does",
    inputSchema: {
      type: "object",
      properties: {
        param1: {
          type: "string",
          description: "Description of parameter"
        }
      },
      required: ["param1"]
    }
  }
};
```
2. **Add the handler**:
```typescript
const toolHandlers = {
  // ... existing handlers
  "my-new-tool": async ({ param1 }: { param1: string }) => {
    // Your tool logic here
    return {
      content: [
        {
          type: "text",
          text: `Result: ${param1}`
        }
      ]
    };
  }
};
```
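
With the dev server running, you can exercise the new tool directly; this example calls the hypothetical `my-new-tool` defined above:

```bash
curl -X POST http://localhost:3000/mcp \
  -H "Content-Type: application/json" \
  -d '{
    "jsonrpc": "2.0",
    "id": 4,
    "method": "tools/call",
    "params": {
      "name": "my-new-tool",
      "arguments": { "param1": "test" }
    }
  }'
```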
### Testing Your Changes
```bash
# Run all tests
npm run test:tools
# Test specific functionality
npm run test:http
npm run test:sse
```
## API Reference
### `tools/list`
Get all available tools:
```json
{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "tools/list"
}
```
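
The response contains one entry per registered tool (here `echo`, `get-current-time`, and `calculate`) with its name, description, and input schema. Trimmed to a single entry, it looks roughly like this; the exact descriptions and schemas come from the definitions in `app/[transport]/route.ts`:

```json
{
  "jsonrpc": "2.0",
  "id": 1,
  "result": {
    "tools": [
      {
        "name": "echo",
        "description": "Echo any message back",
        "inputSchema": {
          "type": "object",
          "properties": {
            "message": { "type": "string" }
          },
          "required": ["message"]
        }
      }
    ]
  }
}
```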
### `tools/call`
Call a specific tool:
```json
{
  "jsonrpc": "2.0",
  "id": 2,
  "method": "tools/call",
  "params": {
    "name": "tool-name",
    "arguments": {
      "param": "value"
    }
  }
}
```
## Deployment
### Vercel (Recommended)
1. **Deploy to Vercel:**
```bash
vercel
```
2. **Add environment variables in Vercel dashboard:**
- `UPSTASH_REDIS_REST_URL`
- `UPSTASH_REDIS_REST_TOKEN`
3. **Update your AI tool configurations** to use the deployed URL:
```
https://your-app.vercel.app/mcp
https://your-app.vercel.app/sse
```
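
For example, the Claude Desktop entry from earlier would point `mcp-remote` at the deployed endpoint instead of localhost (replace `your-app.vercel.app` with your actual domain):

```json
{
  "mcpServers": {
    "custom-mcp": {
      "command": "npx",
      "args": [
        "-y",
        "mcp-remote",
        "https://your-app.vercel.app/mcp"
      ]
    }
  }
}
```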
### Other Platforms
The server is a standard Next.js application and can be deployed to any platform that supports Node.js:
- Netlify
- Railway
- Render
- DigitalOcean App Platform
## Contributing
1. Fork the repository
2. Create a feature branch: `git checkout -b feature/my-new-feature`
3. Make your changes and add tests
4. Run the test suite: `npm run test:tools`
5. Commit your changes: `git commit -am 'Add some feature'`
6. Push to the branch: `git push origin feature/my-new-feature`
7. Submit a pull request
## License
MIT License - see LICENSE file for details.
## Troubleshooting
### Common Issues
**Server not starting:**
- Check if port 3000 is available
- Ensure all dependencies are installed: `npm install`
**Redis connection issues:**
- Verify Docker is running: `docker ps`
- Check Redis container status: `docker ps -a | grep redis-mcp`
- Restart Redis: `docker restart redis-mcp`
**AI tool not detecting server:**
- Ensure the server is running and accessible
- Check the configuration file syntax (valid JSON)
- Restart your AI tool after configuration changes
- Verify the server URL is correct
**Tool calls failing:**
- Check server logs for error messages
- Test tools manually with `npm run test:tools`
- Verify the tool parameters match the expected schema
### Debug Mode
Enable debug logging by setting the environment variable:
```bash
DEBUG=1 npm run dev
```
## Support
- Create an issue on GitHub for bug reports
- Check existing issues for common problems
- Review the test scripts for usage examples