Chris MCP

A context provider describing how I program: essentially an AI version of me, for AI coding utilities like Cline.

1. Install

The following sections describe several ways to install this MCP server.

Make sure you are using Node.js version 22.
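A quick shell guard (nothing project-specific assumed) can confirm the Node.js version before installing:

```shell
# Warn if the active Node.js major version is not 22
REQUIRED=22
MAJOR="$(node --version | sed 's/^v//' | cut -d. -f1)"
if [ "$MAJOR" != "$REQUIRED" ]; then
  echo "Expected Node $REQUIRED, found ${MAJOR:-none}" >&2
fi
```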

1.1. Option 1: Using NPX

Run the following commands in the same folder where your other MCP servers are installed.

$ mkdir chris-mcp
$ cd chris-mcp
$ npx -y chris-mcp fetch --output ./data
$ npx -y chris-mcp verify --output ./data
$ pwd

Copy the response from pwd and edit your MCP server configuration by following one of the options below.
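To avoid copying the path by hand, the same value can be captured in a shell variable; the trailing comment shows a purely hypothetical path, and yours will differ:

```shell
# Capture the absolute data directory once and reuse it in the config
DATA_DIR="$(pwd)/data"
echo "$DATA_DIR"    # e.g. /home/you/mcp-servers/chris-mcp/data (hypothetical)
```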

1.1.1. Using NPX With Claude Desktop

Add the following configuration to your claude_desktop_config.json where [pwd] is the response from the pwd command earlier.

{
  "mcpServers": {
    "github.com/cblanquera/mcp": {
      "command": "npx",
      "args": [
        "-y",
        "chris-mcp",
        "serve",
        "--input",
        "[pwd]/data"
      ]
    }
  }
}

1.1.2. Using NPX With Cline

Add the following configuration to your cline_mcp_settings.json where [pwd] is the response from the pwd command earlier.

{
  "mcpServers": {
    "github.com/cblanquera/mcp": {
      "command": "npx",
      "args": [ 
        "-y", 
        "chris-mcp", 
        "serve", 
        "--input", 
        "[pwd]/data" 
      ]
    }
  }
}

1.2. Option 2: Direct From the Repository

Run the following commands in the same folder where your other MCP servers are installed.

$ git clone https://github.com/cblanquera/mcp.git chris-mcp
$ cd chris-mcp
$ npm i
$ npm run build
$ npm run fetch -- --output ./data
$ npm run verify -- --output ./data
$ pwd

Copy the response from pwd and edit your MCP server configuration by following one of the options below.

1.2.1. From the Repository With Claude Desktop

Add the following configuration to your claude_desktop_config.json.

{
  "mcpServers": {
    "github.com/cblanquera/mcp": {
      "command": "node",
      "args": [
        "[pwd]/dist/scripts/serve.js",
        "--input",
        "[pwd]/data"
      ]
    }
  }
}

1.2.2. From the Repository With Cline

Add the following configuration to your cline_mcp_settings.json.

{
  "mcpServers": {
    "github.com/cblanquera/mcp": {
      "command": "node",
      "args": [ 
        "[pwd]/dist/scripts/serve.js", 
        "--input", 
        "[pwd]/data" 
      ]
    }
  }
}

1.3. Option 3: From Prompt

  1. Copy and paste the following prompt.

Set up the MCP server from https://github.com/cblanquera/mcp while adhering to these MCP server installation rules:
- Start by loading the MCP documentation.
- Use "github.com/cblanquera/mcp" as the server name in cline_mcp_settings.json.
- Create the directory for the new MCP server before starting installation.
- Make sure you read the user's existing cline_mcp_settings.json file before editing it with this new mcp, to not overwrite any existing servers.
- Use commands aligned with the user's shell and operating system best practices.
- Once installed, demonstrate the server's capabilities by using one of its tools.
Here is the project's README to help you get started:
  2. Then paste in this README.


2. Usage

If you installed from the repository, you can start the server like the following.

$ npm start -- --input [pwd]/data

Or you can run the built script directly like the following.

$ node [pwd]/dist/scripts/serve.js --input [pwd]/data

If you installed via npx, you can start the server like the following.

$ npx chris-mcp serve --input [pwd]/data

2.1. Fetching Updated Context

You can manually fetch and verify the context like the following.

$ npm run fetch -- --output [pwd]/data
$ npm run verify -- --output [pwd]/data

Or you can run the built scripts directly like the following.

$ node [pwd]/dist/scripts/fetch.js --output [pwd]/data
$ node [pwd]/dist/scripts/verify.js --output [pwd]/data

If you installed via npx, you can fetch and verify the context like the following.

$ npx chris-mcp fetch --output [pwd]/data
$ npx chris-mcp verify --output [pwd]/data

2.2. Upgrading Search Model

The MCP runs Xenova/all-MiniLM-L6-v2 locally to map a free-form prompt to the best search query term (think: arbitrary prompt → correct query → ask the MCP). You can upgrade this to use your OpenAI key by adding the OPENAI_HOST, OPENAI_KEY, and EMBEDDING_MODEL environment variables to your MCP settings like the following.

{
  "mcpServers": {
    "chris-context": {
      "command": "npx",
      "args": [
        "-y",
        "chris-mcp",
        "serve",
        "--input",
        "[pwd]/data"
      ],
      "env": {
        "OPENAI_HOST": "https://api.openai.com/v1",
        "OPENAI_KEY": "sk-xxx",
        "EMBEDDING_MODEL": "text-embedding-3-small"
      }
    }
  }
}

WARNING: OpenRouter does not support the /embeddings API endpoint, which is called whenever an OpenAI-compatible host is configured.
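For reference, an OpenAI-compatible host must expose the standard embeddings route. Below is a sketch of the URL the server would call; the actual request is commented out since it needs a live key, and the query string shown is a made-up example:

```shell
# Build the embeddings endpoint URL from the host configured above
OPENAI_HOST="https://api.openai.com/v1"
EMBED_URL="$OPENAI_HOST/embeddings"
echo "$EMBED_URL"
# Actual request (requires a valid OPENAI_KEY):
# curl -sS "$EMBED_URL" \
#   -H "Authorization: Bearer $OPENAI_KEY" \
#   -H "Content-Type: application/json" \
#   -d '{"model": "text-embedding-3-small", "input": "coding standards"}'
```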

3. Maximizing Your Knowledge Base

Create a rule (markdown file) called Chris-MCP-Rule.md in your knowledge folder (e.g. .clinerules) with the following content.

# Rule: Using the Chris MCP

- If the user mentions "chris" and asks about code formatting, coding styles, coding standards, documentation styles, or testing styles, use the MCP tool `chris-context.search_context`.
- If the user asks for a compact summary of rules for code formatting, writing documentation, or writing tests, use the MCP tool `chris-context.build_brief`.
- Always prefer these MCP tools over answering from memory.
