# Logic MCP Server

## Overview

The `logic-mcp` server is a backend application designed to execute advanced logic primitives and cognitive operations. It leverages the Model Context Protocol (MCP) to provide tools for reasoning, data processing, and interaction with Large Language Models (LLMs). This server forms the core engine for complex task execution and structured thought processing.
It features dynamic LLM configuration, allowing users to switch between different language models and providers (like OpenRouter, Google Gemini, etc.) via API calls or a companion web application. All operations and their relationships are traced and stored in a SQLite database, enabling history reconstruction and logic chain visualization.
This server is intended to be used in conjunction with the Logic MCP Webapp for easier management and interaction.
## Demonstration: Logic Puzzle Solving
Watch a demonstration of the Logic MCP server attempting to solve the "Passport Pandemonium" logic puzzle.
## Features
- Model Context Protocol (MCP) Server: Exposes logic operations as tools.
- Dynamic LLM Configuration:
  - Add, activate, and delete LLM provider configurations (e.g., OpenRouter, Gemini).
  - The server uses the currently active LLM configuration.
  - Falls back to a default LLM if no user configuration is active.
- Logic Primitives: Supports operations like `define`, `infer`, `decide`, `synthesize`, etc. (extensible).
- Database Tracing: All operations and logic chains are stored in a SQLite database for traceability and history.
- HTTP API:
  - Manage LLM configurations (`/api/llm-config`).
  - Explore logic chains and operations (`/api/logic-explorer`).
- Environment Variable Management: Uses a `.env` file for API keys.
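As a rough sketch of how an MCP client might invoke one of these primitives, a `tools/call` request could look like the following. Note that the tool name `infer` comes from the feature list above, but the argument fields (`premises`, `chain_id`) are assumptions for illustration, not this project's documented schema:

```json
{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "tools/call",
  "params": {
    "name": "infer",
    "arguments": {
      "premises": [
        "All MCP servers expose tools.",
        "logic-mcp is an MCP server."
      ],
      "chain_id": "example-chain-1"
    }
  }
}
```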
## Companion Web Application
A web application is available to interact with this server, manage LLM configurations, and explore logic chains:
- Repository: Mnehmos/logic-mcp-webapp
- Functionality:
  - View and manage LLM provider configurations.
  - Activate specific LLM configurations for the server to use.
  - View executed logic chains and their operations.
  - Clear LLM configurations and logic chain history.
## Getting Started

### Prerequisites
- Node.js (v18+ recommended)
- npm or yarn
### Installation
- Clone the repository.
- Install dependencies.
- Set up environment variables:
  - Copy `.env.example` to `.env` (if an example file exists, otherwise create `.env`).
  - Fill in the required API keys, especially `OPENROUTER_API_KEY` for the default LLM and any other providers you intend to use (e.g., `GEMINI_API_KEY`).
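In a shell, the steps above might look like the following. The repository URL is an assumption based on the companion webapp's GitHub organization, and the `.env` contents are placeholders; adjust both to your setup:

```shell
# Clone the repository (URL assumed from the companion webapp's org)
git clone https://github.com/Mnehmos/logic-mcp.git
cd logic-mcp

# Install dependencies
npm install

# Set up environment variables: copy the example file if present,
# otherwise create .env by hand, then add your API keys
cp .env.example .env
# edit .env: OPENROUTER_API_KEY=..., GEMINI_API_KEY=...
```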
### Running the Server
- Compile TypeScript:
- Start the server:
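Assuming conventional npm scripts (the script names are an assumption; check the project's `package.json` for the actual names):

```shell
# Compile TypeScript to JavaScript
npm run build

# Start the server (MCP on stdio, HTTP API on its default port)
npm start
```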
The MCP server will start on stdio, and the HTTP API will be available (default: `http://localhost:3001`).
## API Endpoints
- LLM Configurations: `GET, POST, PUT, DELETE /api/llm-config`
  - Activate: `PATCH /api/llm-config/:id/activate`
- Logic Explorer: `GET /api/logic-explorer/chains`, `GET /api/logic-explorer/chains/:chainId`, etc.
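A few illustrative `curl` calls against the HTTP API. Only the paths come from this README; the configuration id and any request/response shapes are assumptions:

```shell
# List all LLM configurations
curl http://localhost:3001/api/llm-config

# Activate the configuration with id 2 (id is a placeholder)
curl -X PATCH http://localhost:3001/api/llm-config/2/activate

# List executed logic chains
curl http://localhost:3001/api/logic-explorer/chains
```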
## Contributing
Contributions are welcome! Please feel free to submit pull requests or open issues.
This README provides a basic overview. Further details on specific primitives, API usage, and advanced configurations will be added as the project evolves.
## Example Configuration
Below is an example runtime configuration for the logic-mcp server as it would appear in an MCP settings file:
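A hedged sketch of such an entry, following common MCP client settings conventions (`mcpServers` key, `command`/`args`/`env` fields); the entry name, build output path, and environment variable values are assumptions:

```json
{
  "mcpServers": {
    "logic-mcp": {
      "command": "node",
      "args": ["build/index.js"],
      "env": {
        "OPENROUTER_API_KEY": "YOUR_OPENROUTER_API_KEY"
      }
    }
  }
}
```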