MCP Proof of Concept

This repository contains a simple Model Context Protocol (MCP) server implemented with FastAPI. The goal is to expose herd data through a discoverable, versioned API that can be deployed to AWS Fargate.

Running locally

  1. Install dependencies:

    pip install -r requirements.txt
  2. Seed the SQLite database:

    python -m app.seed

    The database path can be configured via the DATABASE_PATH environment variable; if unset, it defaults to mcp.db in the working directory.
  3. Start the API server:

    uvicorn app.main:app --reload
  4. Authenticate with the token fake-super-secret-token when calling the API.
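An authenticated call from step 4 can be sketched in Python with the standard library. Note that the endpoint path /api/v1/herd used below is an assumption for illustration, not necessarily the server's actual route:

```python
import urllib.request

TOKEN = "fake-super-secret-token"

# Build a request carrying the bearer token; the path is hypothetical.
req = urllib.request.Request(
    "http://localhost:8000/api/v1/herd",
    headers={"Authorization": f"Bearer {TOKEN}"},
)
print(req.get_header("Authorization"))
# With the server running, urllib.request.urlopen(req) would return the JSON body.
```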

The MCP discovery file is available at model_context.yaml.
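The exact schema of the discovery file is defined by the repository; a hypothetical minimal shape, purely for illustration (these keys are assumptions, not the actual contents), might look like:

```yaml
# Illustrative sketch only; real keys may differ.
name: mcp-poc
version: v1
api:
  base_path: /api/v1
```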


Using the agent

An agent package is provided to interact with the MCP server. After the server is running you can list the herd data like so:

python -m agent http://localhost:8000 --token fake-super-secret-token

The agent reads model_context.yaml to discover the API path and returns the JSON response from the server. For full YAML support install the optional PyYAML dependency; otherwise a limited built-in parser is used.
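The fallback behaviour described above can be sketched as follows. This is a simplified stand-in for the agent's actual parser, handling only flat key: value lines; the api_path key in the example is illustrative:

```python
import os
import tempfile


def load_context(path):
    """Load a YAML file, preferring PyYAML; fall back to a naive key: value parser."""
    try:
        import yaml  # optional dependency
        with open(path) as f:
            return yaml.safe_load(f)
    except ImportError:
        data = {}
        with open(path) as f:
            for line in f:
                line = line.strip()
                # Skip blanks and comments; split the first colon only.
                if line and not line.startswith("#") and ":" in line:
                    key, _, value = line.partition(":")
                    data[key.strip()] = value.strip()
        return data


# Usage: parse a minimal discovery file (contents here are illustrative).
with tempfile.NamedTemporaryFile("w", suffix=".yaml", delete=False) as f:
    f.write("api_path: /api/v1/herd\n")
    path = f.name
ctx = load_context(path)
os.unlink(path)
print(ctx)
```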

Running tests

pytest -q

Container

A Dockerfile is provided to run the server in a container. Build with:

docker build -t mcp .

Terraform

The terraform directory contains a minimal configuration showing how the container could be deployed to AWS (e.g. Fargate). It creates an ECR repository for the image.
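A minimal sketch of such an ECR resource is shown below; the provider region and resource names are assumptions for illustration, not the repository's actual configuration:

```hcl
# Illustrative only; see the terraform directory for the real configuration.
provider "aws" {
  region = "us-east-1"
}

resource "aws_ecr_repository" "mcp" {
  name = "mcp"
}
```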

