Agents Library MCP Server

This is a Python project that serves as an MCP (Model Context Protocol) server. It uses FastAPI as the web framework and Uvicorn as the ASGI server, and includes an agents-library for managing agent-related rules and prompts.

Technologies Used

  • Python
  • FastAPI: Web framework for building APIs.
  • Uvicorn: ASGI server.
  • mcp: Model Context Protocol SDK.

Project Structure

  • Dockerfile: Used for containerizing the application.
  • compose.yaml: Used for running the application with Docker Compose.
  • requirements.txt: Lists Python dependencies.
  • app/:
    • server.py: The main application server.
  • agents-library/:
    • dev_rules.agents.md: Development-related agent rules.
    • security_checks.agents.md: Security-related agent checks.
    • common_prompts.agents.md: Common prompts for agents.
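The server's core job is to list and serve the .agents.md files in agents-library/. The actual app/server.py is not shown here, but the file-handling logic its MCP tools presumably wrap can be sketched with the standard library alone; the function names mirror the tool names documented below, and everything else is an assumption:

```python
from pathlib import Path

# Hypothetical helpers mirroring what app/server.py's MCP tools might do;
# the real implementation may differ.
AGENTS_DIR = Path("agents-library")

def list_agents_instructions(base: Path = AGENTS_DIR) -> list[str]:
    """Return the names of all *.agents.md files in the library."""
    return sorted(p.name for p in base.glob("*.agents.md"))

def get_agents_instructions(file_name: str, base: Path = AGENTS_DIR) -> str:
    """Return one instruction file's contents, refusing path traversal."""
    path = (base / file_name).resolve()
    if base.resolve() not in path.parents:
        raise ValueError(f"{file_name} is outside the agents library")
    return path.read_text(encoding="utf-8")
```

The traversal check matters because a tool exposed to an AI client should not be able to read arbitrary files via a crafted file_name such as "../secrets".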

Getting Started

To get started with this project, you need to have Python 3.10+ and Docker installed on your system.

Prerequisites

  • Python 3.10+
  • Docker
  • pip

Installation

  1. Create and Activate Virtual Environment:
    python3 -m venv venv
    source venv/bin/activate
  2. Install Dependencies:
    pip install -r requirements.txt

Building and Running

You can run the server using Docker Compose or directly with Uvicorn.

Using Docker Compose

To run the server with Docker Compose, use the following command:

docker compose up

Using Uvicorn

To run the server with Uvicorn, use the following command:

uvicorn app.server:app --host 0.0.0.0 --port 8080

Development Conventions

  • Virtual Environments: Always use a virtual environment for dependency management.
  • Dependencies: All Python dependencies should be listed in requirements.txt.

API Endpoints

The following API endpoints are available:

  • POST /test/call_tool: Test endpoint for direct tool invocation.
  • POST /test/read_resource: Test endpoint for direct resource invocation.

The MCP server also exposes the following tools:

  • get_agents_instructions: Retrieves a specific AGENTS.md file to provide an AI with instructions and context.
  • list_agents_instructions: Lists all available AGENTS.md files.
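The /test/call_tool endpoint can be exercised directly over HTTP. The request shape below (a JSON body with "name" and "arguments" fields) is an assumption; check app/server.py for the exact fields the endpoint expects:

```python
import json
from urllib import request

def build_call_tool_request(tool: str, arguments: dict) -> request.Request:
    """Build a POST request for the server's /test/call_tool endpoint.

    The payload shape is assumed, not confirmed against app/server.py.
    """
    body = json.dumps({"name": tool, "arguments": arguments}).encode("utf-8")
    return request.Request(
        "http://localhost:8080/test/call_tool",
        data=body,
        headers={"Content-Type": "application/json"},
        method="POST",
    )

req = build_call_tool_request(
    "get_agents_instructions", {"file_name": "dev_rules.agents.md"}
)
# urllib.request.urlopen(req) would send it once the server is running.
```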

Adding to gemini-cli

To add this server to gemini-cli, you need to edit your settings.json file. You can find this file in ~/.gemini/settings.json (user settings) or in .gemini/settings.json (project settings).

Add the following to your settings.json file:

{
  "mcpServers": {
    "httpServer": {
      "httpUrl": "http://<ip-address>:8080"
    }
  }
}
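Since settings.json may already hold other gemini-cli settings, it is safer to merge the entry than to overwrite the file. A minimal sketch, assuming the settings layout shown above (the helper name is hypothetical):

```python
import json
from pathlib import Path

def add_mcp_server(settings_path: Path, name: str, http_url: str) -> None:
    """Merge an mcpServers entry into settings.json, preserving other keys."""
    settings = {}
    if settings_path.exists():
        settings = json.loads(settings_path.read_text(encoding="utf-8"))
    settings.setdefault("mcpServers", {})[name] = {"httpUrl": http_url}
    settings_path.write_text(json.dumps(settings, indent=2), encoding="utf-8")
```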

Using the mcp tool

Once the mcp-server is configured in gemini-cli, you can use the mcp tool to interact with the server. For example, to list all available agent instructions:

gemini mcp list_agents_instructions

To retrieve a specific agent instruction file:

gemini mcp get_agents_instructions --file_name dev_rules.agents.md


License

This project is licensed under the Apache License 2.0.

Author

This project was started in 2025 by Nicholas Wilde.

