Chris MCP

by cblanquera


A context provider describing how I program. Basically the AI version of me, for AI utilities like Cline.

1. Install

The following sections describe several ways to install this MCP.

Make sure you are using Node version 22.

1.1. Option 1: Using NPX

Run the following commands in the same folder where your other MCP servers are installed.

```shell
$ mkdir chris-mcp
$ cd chris-mcp
$ npx -y chris-mcp fetch --output ./data
$ npx -y chris-mcp verify --output ./data
$ pwd
```

Copy the response from pwd and edit your MCP server configuration by following one of the options below.

1.1.1. Using NPX With Claude Desktop

Add the following configuration to your claude_desktop_config.json where [pwd] is the response from the pwd command earlier.

```json
{
  "mcpServers": {
    "chris-context": {
      "command": "npx",
      "args": ["-y", "chris-mcp", "serve", "--input", "[pwd]/data"]
    }
  }
}
```
1.1.2. Using NPX With Cline

Add the following configuration to your cline_mcp_settings.json where [pwd] is the response from the pwd command earlier.

```json
{
  "name": "chris-context",
  "command": "npx",
  "args": ["-y", "chris-mcp", "serve", "--input", "[pwd]/data"]
}
```

1.2. Option 2: Direct From the Repository

Run the following commands in the same folder where your other MCP servers are installed.

```shell
$ git clone https://github.com/cblanquera/mcp.git chris-mcp
$ cd chris-mcp
$ npm i
$ npm run build
$ npm run fetch -- --output ./data
$ npm run verify -- --output ./data
$ pwd
```

Copy the response from pwd and edit your MCP server configuration by following one of the options below.

1.2.1. From the Repository With Claude Desktop

Add the following configuration to your claude_desktop_config.json.

```json
{
  "mcpServers": {
    "chris-context": {
      "command": "node",
      "args": ["[pwd]/dist/scripts/serve.js", "--input", "[pwd]/data"]
    }
  }
}
```
1.2.2. From the Repository With Cline

Add the following configuration to your cline_mcp_settings.json.

```json
{
  "name": "chris-context",
  "command": "node",
  "args": ["[pwd]/dist/scripts/serve.js", "--input", "[pwd]/data"]
}
```

1.3. Option 3: From Prompt

1. Copy and paste the following prompt.

   ```
   Set up the MCP server from https://github.com/cblanquera/mcp while adhering to these MCP server installation rules:
   - Start by loading the MCP documentation.
   - Use "github.com/cblanquera/mcp" as the server name in cline_mcp_settings.json.
   - Create the directory for the new MCP server before starting installation.
   - Make sure you read the user's existing cline_mcp_settings.json file before editing it with this new MCP, to not overwrite any existing servers.
   - Use commands aligned with the user's shell and operating system best practices.
   - Once installed, demonstrate the server's capabilities by using one of its tools.
   Here is the project's README to help you get started:
   ```

2. Then paste in this README.

2. Usage

You can manually start the server like the following.

```shell
$ npm start -- --input [pwd]/data
```

Or you can run the compiled script directly like the following.

```shell
$ node [pwd]/dist/scripts/serve.js --input [pwd]/data
```

If you installed via npx, you can start the server like the following.

```shell
$ npx chris-mcp serve --input [pwd]/data
```

2.1. Fetching Updated Context

You can manually fetch and verify the context like the following.

```shell
$ npm run fetch -- --output [pwd]/data
$ npm run verify -- --output [pwd]/data
```

Or you can run the compiled scripts directly like the following.

```shell
$ node [pwd]/dist/scripts/fetch.js --output [pwd]/data
$ node [pwd]/dist/scripts/verify.js --output [pwd]/data
```

If you installed via npx, you can fetch and verify like the following.

```shell
$ npx chris-mcp fetch --output [pwd]/data
$ npx chris-mcp verify --output [pwd]/data
```

2.2. Upgrading Search Model

The MCP uses Xenova/all-MiniLM-L6-v2 locally to determine the best search query term. Think of it like random prompt → correct query → ask MCP. You can upgrade this to use your OpenAI key by adding the OPENAI_HOST, OPENAI_KEY, and EMBEDDING_MODEL environment variables to your MCP settings like the following.

```json
{
  "name": "chris-context",
  "command": "npx",
  "args": ["-y", "chris-mcp", "serve", "--input", "[pwd]/data"],
  "env": {
    "OPENAI_HOST": "https://api.openai.com/v1",
    "OPENAI_KEY": "sk-xxx",
    "EMBEDDING_MODEL": "text-embedding-3-small"
  }
}
```
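The query-selection step described above amounts to a nearest-neighbor search over embeddings: embed the prompt, compare it against the embeddings of known query terms, and pick the closest. A toy JavaScript sketch of that selection (the vectors and candidate queries here are hypothetical illustrations, not the MCP's actual data or code):

```javascript
// Cosine similarity between two equal-length vectors.
function cosine(a, b) {
  let dot = 0, na = 0, nb = 0;
  for (let i = 0; i < a.length; i++) {
    dot += a[i] * b[i];
    na += a[i] * a[i];
    nb += b[i] * b[i];
  }
  return dot / (Math.sqrt(na) * Math.sqrt(nb));
}

// Pick the stored query whose embedding is closest to the prompt's embedding.
// In the real MCP the vectors come from Xenova/all-MiniLM-L6-v2 (or the
// OpenAI embeddings API when configured); here they are made up.
function bestQuery(promptVec, candidates) {
  return candidates.reduce((best, c) =>
    cosine(promptVec, c.vec) > cosine(promptVec, best.vec) ? c : best
  );
}
```

Swapping in OpenAI embeddings changes only where the vectors come from; the selection logic stays the same.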

WARNING: OpenRouter doesn't support the /embeddings API endpoint, which is called whenever an OpenAI-compatible host is provided.
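As the warning above implies, whatever host you point OPENAI_HOST at must serve the standard OpenAI POST /embeddings route. A minimal sketch of the request shape such a host needs to accept, assuming the standard OpenAI wire format (the `embeddingRequest` helper is illustrative, not part of the MCP):

```javascript
// Build the request an OpenAI-compatible host is expected to serve.
// Assumption: the standard OpenAI embeddings wire format -- a POST to
// {host}/embeddings with a JSON body of { model, input }.
function embeddingRequest(host, key, model, input) {
  return {
    url: host.replace(/\/+$/, '') + '/embeddings',
    options: {
      method: 'POST',
      headers: {
        'Authorization': 'Bearer ' + key,
        'Content-Type': 'application/json'
      },
      body: JSON.stringify({ model, input })
    }
  };
}
```

A host that returns 404 for this route cannot be used as an embedding provider here, even if its chat endpoints are otherwise OpenAI-compatible.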

3. Maximizing Your Knowledge Base

Create a rule (a markdown file) called Chris-MCP-Rule.md in your knowledge folder (e.g. .clinerules) with the following content.

```markdown
# Rule: Using the Chris MCP

If the user mentions topics about:

- coding with javascript
- coding with typescript
- coding with react
- coding with idea
- coding with ingest
- coding with inquire
- coding with reactus
- coding with stackpress
- markdown documentation
- testing with mocha
- testing with chai
- testing with jest

you must do the following.

- If the user asks about library rules, guidelines, or context, use the MCP tool `chris-context.search_context`.
- If the user asks for a compact summary of rules for a task, use the MCP tool `chris-context.build_brief`.
- Always prefer these MCP tools over answering from memory.
```