
L.O.G. (Latent Orchestration Gateway)

by Lucineer

LOG-mcp

Stop guessing which AI model to use. This MCP server builds a dataset of your preferences each time you choose between draft responses. It learns from your actual choices, not from general benchmarks.

Live URL: https://log-mcp.casey-digennaro.workers.dev
License: MIT • Runtime: Cloudflare Workers • Dependencies: 0

Why This Exists

Public model rankings often don't reflect your specific needs. This server learns your preferences directly from the choices you make while working, helping it route future prompts to the model you'd likely choose.

Quick Start

To create your own instance:

  1. Fork this repository.

  2. Deploy to Cloudflare Workers using the one-click button in your fork.

  3. Add your model API keys in the Worker's environment variables.

For local development:

git clone https://github.com/your-username/log-mcp
cd log-mcp
cp .env.example .env
# Add your API keys to the .env file
npm run dev
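A plausible shape for the `.env` file, assuming one key per configured provider — the actual variable names are defined in `.env.example`, so treat these as placeholders:

```
# Illustrative only — copy the real variable names from .env.example
OPENAI_API_KEY=sk-...
ANTHROPIC_API_KEY=sk-ant-...
GOOGLE_API_KEY=...
```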

How It Works

When you submit a prompt, LOG-mcp generates draft responses from each configured model. You select the best one. Each choice trains your private preference profile. Over time, it begins routing prompts directly to the model you would have selected. All choice data remains within your Worker.
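The learning loop above can be sketched as a simple win tally: count how often each model's draft is chosen per task category, then route to the current winner. The function names and the category-based keying here are illustrative assumptions, not the server's actual API.

```typescript
// Hypothetical sketch of the preference loop. recordChoice/routePrompt-style
// names and the per-category tally are assumptions for illustration.

type Choice = { category: string; chosenModel: string };

// Tally wins per (category, model) from past draft selections.
function buildProfile(choices: Choice[]): Map<string, Map<string, number>> {
  const profile = new Map<string, Map<string, number>>();
  for (const { category, chosenModel } of choices) {
    const tally = profile.get(category) ?? new Map<string, number>();
    tally.set(chosenModel, (tally.get(chosenModel) ?? 0) + 1);
    profile.set(category, tally);
  }
  return profile;
}

// Route to the model most often chosen for this category, or null when
// there is no history yet (in which case all drafts would still be shown).
function routePrompt(
  profile: Map<string, Map<string, number>>,
  category: string
): string | null {
  const tally = profile.get(category);
  if (!tally) return null;
  let best: string | null = null;
  let bestCount = 0;
  for (const [model, count] of tally) {
    if (count > bestCount) {
      best = model;
      bestCount = count;
    }
  }
  return best;
}
```

The null return mirrors the behavior described under Limitations: without enough explicit choices, the router has nothing to go on and falls back to showing every model's output.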

Features

  • Preference Learning: Routes prompts based on your past choices.

  • PII Stripping: Removes identifiable information (names, emails, phone numbers) by default before sending to model APIs.

  • Semantic Caching: Skips regeneration for semantically similar, previously answered prompts.

  • Training Data Export: Export your preference pairs in standard LoRA/DPO format.

  • Multi-Model Support: Configure endpoints for OpenAI, Anthropic, Google, and open-weight models.

  • MCP Compliance: Works with any client supporting the Model Context Protocol.

  • Zero Dependencies: A single, self-contained script that deploys quickly.
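For the Training Data Export feature, a common shape for DPO-style preference pairs is `{prompt, chosen, rejected}`; each selection against N drafts yields N−1 pairs. The record fields below are assumptions following that convention — the server's actual export schema may differ:

```typescript
// Hypothetical converter from one stored choice to DPO-style pairs.
// Field names follow the common {prompt, chosen, rejected} convention.

interface DraftChoice {
  prompt: string;
  drafts: Record<string, string>; // model name -> draft text
  selectedModel: string;
}

interface DpoPair {
  prompt: string;
  chosen: string;
  rejected: string;
}

function toDpoPairs(choice: DraftChoice): DpoPair[] {
  const chosen = choice.drafts[choice.selectedModel];
  return Object.entries(choice.drafts)
    .filter(([model]) => model !== choice.selectedModel)
    .map(([, rejected]) => ({ prompt: choice.prompt, chosen, rejected }));
}
```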

Limitations

The system requires explicit choice data to learn. If you rarely select between drafts, it cannot build an effective routing profile and will continue to show all model outputs.

Architecture

LOG-mcp runs statelessly on Cloudflare Workers. All preference data is stored in a Cloudflare KV namespace. There are no external databases or background processes to manage.
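Storing choice data in a KV namespace might look like the sketch below. The binding interface is narrowed to the two methods used, and the key scheme (`choices:<userId>`) is an assumption, not the server's actual layout:

```typescript
// Sketch of appending a preference choice to a Cloudflare-KV-style store.
// The key scheme and JSON-array encoding are illustrative assumptions.

interface KVLike {
  get(key: string): Promise<string | null>;
  put(key: string, value: string): Promise<void>;
}

async function appendChoice(
  kv: KVLike,
  userId: string,
  choice: object
): Promise<void> {
  const key = `choices:${userId}`;
  const existing = await kv.get(key);
  const list: object[] = existing ? JSON.parse(existing) : [];
  list.push(choice);
  await kv.put(key, JSON.stringify(list));
}
```

Because Workers are stateless, each invocation reads the current list from KV, appends, and writes it back — no in-process state survives between requests.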

