MCP servers for "Using local LLMs for code writing, reviewing, and rule generation"

Production-ready MCP servers that extend AI capabilities through file access, database connections, APIs, and contextual services.

17,617 servers. Last updated 2026-02-21 20:54

"Using local LLMs for code writing, reviewing, and rule generation" matching MCP servers:

  • -
    security
    A
    license
    -
    quality
    An MCP server that exposes the llms.txt file and its referenced local or external resources from a project root to provide context for AI models. It automatically parses documentation links and URLs to make them accessible as additional MCP resources.
    Last updated 8 months ago
    1
    MIT
  • -
    security
    A
    license
    -
    quality
    Enables fast, token-efficient access to large documentation files in llms.txt format through semantic search. Solves token limit issues by searching first and retrieving only relevant sections instead of dumping entire documentation.
    Last updated 6 months ago
    2
    MIT
  • -
    security
    A
    license
    -
    quality
    A 100% local development monitoring tool that captures browser console logs, network requests, and backend server output for analysis by AI assistants via MCP. It enables LLMs to debug applications by providing structured, real-time access to full-stack log data and persistent local storage.
    Last updated 7 months ago
    2
    5
    MIT
  • Security: A, License: A, Quality: A
    An implementation of Claude Code as a Model Context Protocol server that enables using Claude's software engineering capabilities (code generation, editing, reviewing, and file operations) through the standardized MCP interface.
    Last updated 10 months ago · 8 · 27 · 179 · MIT
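The first server above works by parsing documentation links out of a project's llms.txt file. A minimal sketch of that parsing step, assuming a markdown-style llms.txt of `- [Title](url): description` entries (the function name and return shape here are illustrative, not the server's actual API):

```python
import re

# llms.txt entries are markdown links, optionally followed by
# ": description" — e.g. "- [Docs](https://example.com/docs): API reference"
LINK_RE = re.compile(r"\[([^\]]+)\]\(([^)\s]+)\)(?::\s*(.*))?")

def parse_llms_txt(text: str) -> list[dict]:
    """Extract title/url/description entries from llms.txt content."""
    entries = []
    for line in text.splitlines():
        m = LINK_RE.search(line)
        if m:
            title, url, desc = m.groups()
            entries.append({"title": title, "url": url,
                            "description": desc or ""})
    return entries

sample = """# My Project
- [Guide](https://example.com/guide): getting started
- [API](./docs/api.md)
"""
for entry in parse_llms_txt(sample):
    print(entry["title"], entry["url"])
```

Each extracted URL (local path or external link) could then be registered as an additional MCP resource.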
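The second server avoids dumping entire documents by searching first and returning only relevant sections. A toy illustration of that retrieve-only-what-matches idea, using plain keyword overlap instead of the semantic search the listing describes (all names here are made up for the sketch):

```python
def split_sections(doc: str) -> list[tuple[str, str]]:
    """Split a markdown document into (heading, body) sections."""
    sections, heading, body = [], "(intro)", []
    for line in doc.splitlines():
        if line.startswith("#"):
            sections.append((heading, "\n".join(body)))
            heading, body = line.lstrip("# ").strip(), []
        else:
            body.append(line)
    sections.append((heading, "\n".join(body)))
    return sections

def top_sections(doc: str, query: str, k: int = 1) -> list[str]:
    """Return the k section headings sharing the most terms with the query."""
    terms = set(query.lower().split())
    scored = [(len(terms & set((h + " " + b).lower().split())), h)
              for h, b in split_sections(doc)]
    return [h for score, h in sorted(scored, reverse=True)[:k] if score > 0]

doc = "# Install\npip install foo\n# Usage\nrun foo with a config file\n"
print(top_sections(doc, "config file usage"))
```

A real implementation would rank sections with embeddings rather than word overlap, but the token-saving shape is the same: score every section, then return only the winners to the model.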

Interested in MCP?

Join the MCP community for support and updates.

Reddit · Discord