# LLM Responses MCP Server
A Model Context Protocol (MCP) server that allows multiple AI agents to share and read each other's responses to the same prompt.
## Overview
This project implements an MCP server with two main tool calls:
- `submit-response`: Allows an LLM to submit its response to a prompt
- `get-responses`: Allows an LLM to retrieve all responses from other LLMs for a specific prompt
This enables a scenario where multiple AI agents can be asked the same question by a user and then, using these tools, read and reflect on what the other LLMs said in response to that question.
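For illustration, the two tools could be registered with the MCP TypeScript SDK roughly as follows. This is a minimal sketch: the parameter names (`llmId`, `prompt`, `response`) and the in-memory store are assumptions, not necessarily this project's actual schema or implementation.

```typescript
import { McpServer } from "@modelcontextprotocol/sdk/server/mcp.js";
import { z } from "zod";

// Illustrative in-memory store: prompt text -> list of { llmId, response }
const responses = new Map<string, { llmId: string; response: string }[]>();

const server = new McpServer({ name: "llm-responses", version: "1.0.0" });

// submit-response: an LLM records its answer to a prompt
server.tool(
  "submit-response",
  { llmId: z.string(), prompt: z.string(), response: z.string() },
  async ({ llmId, prompt, response }) => {
    const list = responses.get(prompt) ?? [];
    list.push({ llmId, response });
    responses.set(prompt, list);
    return { content: [{ type: "text", text: "Response recorded" }] };
  }
);

// get-responses: fetch what other LLMs said, optionally filtered by prompt
server.tool(
  "get-responses",
  { prompt: z.string().optional() },
  async ({ prompt }) => {
    const all = prompt
      ? (responses.get(prompt) ?? [])
      : [...responses.values()].flat();
    return { content: [{ type: "text", text: JSON.stringify(all, null, 2) }] };
  }
);

// The SSE/HTTP transport wiring for the /sse and /messages endpoints is omitted here.
```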
## Installation
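Install dependencies with Bun, which this project uses as its runtime:

```bash
bun install
```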
## Development
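A sketch of the typical workflow, assuming the `dev` and `build` scripts referenced by the project:

```bash
# Start the server in development mode
bun run dev

# Compile the TypeScript sources
bun run build
```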
## Testing with MCP Inspector
The project includes support for the MCP Inspector, which is a tool for testing and debugging MCP servers.
The `inspect` script uses `npx` to run the MCP Inspector, which will launch a web interface in your browser for interacting with your MCP server.
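To start it, run the `inspect` script (shown here via Bun; adjust if your setup invokes it differently):

```bash
bun run inspect
```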
This will allow you to:
- Explore available tools and resources
- Test tool calls with different parameters
- View the server's responses
- Debug your MCP server implementation
## Usage
The server exposes two endpoints:
- `/sse` - Server-Sent Events endpoint for MCP clients to connect
- `/messages` - HTTP endpoint for MCP clients to send messages
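As an illustration, an MCP client built with the TypeScript SDK could connect over SSE roughly like this (the host, port, and client name are assumptions; adjust to your setup):

```typescript
import { Client } from "@modelcontextprotocol/sdk/client/index.js";
import { SSEClientTransport } from "@modelcontextprotocol/sdk/client/sse.js";

// The SSE transport connects on /sse and posts messages to the
// endpoint the server advertises (e.g. /messages).
const transport = new SSEClientTransport(new URL("http://localhost:62886/sse"));

const client = new Client({ name: "example-client", version: "1.0.0" }, { capabilities: {} });
await client.connect(transport);

// List the tools the server exposes
console.log(await client.listTools());
```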
## MCP Tools
### submit-response
Submit an LLM's response to a prompt:
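A call through the client sketched above might look like this; the argument names (`llmId`, `prompt`, `response`) are assumptions about the tool's schema:

```typescript
await client.callTool({
  name: "submit-response",
  arguments: {
    llmId: "claude-3-7-sonnet", // identifier for the submitting model (assumed field)
    prompt: "What is the meaning of life?",
    response: "The meaning of life is subjective and depends on individual perspectives.",
  },
});
```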
### get-responses
Retrieve all LLM responses, optionally filtered by prompt:
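Again assuming the client sketched above, and that the filter argument is named `prompt`:

```typescript
// Retrieve stored responses for a specific prompt (omit `prompt` to get all)
const result = await client.callTool({
  name: "get-responses",
  arguments: { prompt: "What is the meaning of life?" },
});
console.log(result);
```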
## License
MIT
## Deployment to EC2
This project includes Docker configuration for easy deployment to EC2 or any other server environment.
### Prerequisites
- An EC2 instance running Amazon Linux 2 or Ubuntu
- Security group configured to allow inbound traffic on port 62886
- SSH access to the instance
### Deployment Steps
- Clone the repository to your EC2 instance (commands for these three steps are sketched after this list):
- Make the deployment script executable:
- Run the deployment script:
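For example (the repository URL is a placeholder and the script name `deploy.sh` is an assumption; substitute your own values):

```bash
# 1. Clone the repository to the EC2 instance
git clone <repository-url>
cd <repository-directory>

# 2. Make the deployment script executable (name assumed to be deploy.sh)
chmod +x deploy.sh

# 3. Run the deployment script
./deploy.sh
```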
The script will:
- Install Docker and Docker Compose if they're not already installed
- Build the Docker image
- Start the container in detached mode
- Display the public URL where your MCP server is accessible
### Manual Deployment
If you prefer to deploy manually:
- Build the Docker image (commands for all three steps are sketched after this list):
- Start the container:
- Verify the container is running:
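For example (the image and container names are illustrative):

```bash
# 1. Build the Docker image
docker build -t llm-responses-mcp .

# 2. Start the container in detached mode, exposing port 62886
docker run -d -p 62886:62886 --name llm-responses-mcp llm-responses-mcp
# Alternatively, if using the included Compose configuration:
# docker compose up -d --build

# 3. Verify the container is running
docker ps
```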
### Accessing the Server
Once deployed, your MCP server will be accessible at:
- `http://<ec2-public-ip>:62886/sse` - SSE endpoint
- `http://<ec2-public-ip>:62886/messages` - Messages endpoint
Make sure port 62886 is open in your EC2 security group!
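As a quick sanity check, you can hit the SSE endpoint with `curl`; if the server is up, the connection stays open and streams events:

```bash
curl -N http://<ec2-public-ip>:62886/sse
```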