basic mcp

The goal of this project is just to write a simple MCP server with a handful of tools that can be plugged into a local LLM.

It's just for experimentation, but it's also straightforward enough to be human-readable (for those trying to learn).

Learning

In src/basic_mcp/main.py you can see how straightforward it is to make an MCP server with FastMCP. It's essentially just a few function calls, and the docstrings are what get passed to the LLM.

Note that MCP servers are (in a sense) just polite suggestions of text to LLMs, so without the docstrings here it actually won't work (or will work orders of magnitude worse).
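
To make that concrete, here is a minimal sketch of the FastMCP pattern. The tool and its docstring here are illustrative, not the repo's actual code; this assumes the fastmcp package, though the official MCP SDK's mcp.server.fastmcp exposes the same interface.

from fastmcp import FastMCP

mcp = FastMCP("basic-mcp")

@mcp.tool()
def add(a: int, b: int) -> int:
    """Add two integers and return the sum."""
    # The docstring above is the description the LLM actually sees;
    # the function body is what runs when the tool is called.
    return a + b

if __name__ == "__main__":
    # Speak MCP over stdio (see the Output section below)
    mcp.run(transport="stdio")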

In src/basic_mcp/tools/web_tools.py there's a simple 'fetch article' tool. This is just so I have a placeholder/structure for extra tools as they're needed; a sketch of the pattern is below.
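
A hypothetical sketch of what such a tool's body might look like (the real web_tools.py may differ; this uses only the standard library):

from urllib.request import Request, urlopen

def fetch_article(url: str) -> str:
    """Fetch the raw contents of a web page and return it as text."""
    req = Request(url, headers={"User-Agent": "basic-mcp"})
    with urlopen(req, timeout=10) as resp:
        return resp.read().decode("utf-8", errors="replace")

Registered with the same @mcp.tool() decorator as above, this becomes callable by the LLM like any other tool.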


install

As with everything, you should set this up in a virtualenv.

Then it's just a pip install -e .
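
For example, a typical setup looks like:

python -m venv .venv
source .venv/bin/activate
pip install -e .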

running

cd src/basic_mcp && python main.py

uvx

This can be called with uvx --from /path/to/basic-mcp/ basic-mcp if it needs to be launched that way (for example, with Jan).

Also note/remember that if uvx lives in a virtualenv, it should be invoked with the full path to the binary inside that virtualenv's bin directory.
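
For example (both paths here are placeholders):

/path/to/venv/bin/uvx --from /path/to/basic-mcp/ basic-mcp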

Output

Note that the output transport is stdio, not a web server. This can be changed easily enough inside the main function by removing the transport parameter; it will then default to running as a web server.
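
A sketch of what that toggle looks like in the entry point (the exact code in main.py may differ):

if __name__ == "__main__":
    # As shipped: speak MCP over stdio
    mcp.run(transport="stdio")
    # Per the note above, dropping the transport argument
    # makes it run as a web server instead:
    # mcp.run()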
