Systems MCP

by lethain


systems-mcp is an MCP server for interacting with the lethain:systems library for systems modeling.

It provides two tools:

  • run_systems_model runs a systems model from its specification. It takes two parameters: the specification, and optionally the number of rounds to run the model (defaulting to 100).
  • load_systems_documentation loads documentation and examples into the context window. This is useful for priming models to be more helpful at writing systems models.
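For reference, a systems specification is a small text DSL describing stocks and the flows between them. The sketch below is a hiring-funnel style model in the spirit of the lethain:systems documentation; treat the exact syntax as an assumption and use load_systems_documentation for authoritative examples:

```
# [Candidates] is assumed to be an infinite stock feeding the funnel;
# integer rates move that many units per round, decimal rates act as conversions.
[Candidates] > PhoneScreens @ 25
PhoneScreens > Onsites @ 0.5
Onsites > Offers @ 0.5
Offers > Hires @ 0.5
Hires > Employees @ 1.0
```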

It is intended for running locally in conjunction with Claude Desktop or a similar tool.

Usage

Here's an example of using systems-mcp to run and render a model.

Example prompt for using systems-mcp

Here is the artifact generated from that prompt, including the output from running the systems model.

Example of artifact for using the output of systems-mcp

Finally, here is an example of using the load_systems_documentation tool to prime the context window and using it to help generate a systems specification. This is loosely equivalent to including lethain:systems/README.md in the context window, but also includes a handful of additional examples (see the included files in ./docs/).

Example prompt of loading documentation into context window

Then you can render the model as before.

Example prompt of rendering the generated model

The most interesting piece here is that I've never personally used systems to model a social network, but the LLM was able to do a remarkably decent job at generating a specification despite that.

Installation

These instructions describe installation for Claude Desktop on OS X. It should work similarly on other platforms.

  1. Install Claude Desktop.
  2. Clone systems-mcp into a convenient location; I'm assuming /Users/will/systems-mcp
  3. Make sure you have uv installed; you can follow these instructions
  4. Go to Claude Desktop, Settings, Developer, and have it create your MCP config file. Then you want to update your claude_desktop_config.json. (Note that you should replace will with your user, e.g. the output of whoami.)
    cd "/Users/will/Library/Application Support/Claude"
    vi claude_desktop_config.json
    Then add this section:
    {
      "mcpServers": {
        "systems": {
          "command": "uv",
          "args": [
            "--directory",
            "/Users/will/systems-mcp",
            "run",
            "main.py"
          ]
        }
      }
    }
  5. Close Claude and reopen it.
  6. It should work...
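If Claude doesn't pick up the server, one quick check is that your edited config still parses as JSON and contains the systems entry. A minimal sketch of that check (the helper name here is illustrative, not part of systems-mcp):

```python
import json

def check_mcp_config(text: str) -> str:
    """Parse claude_desktop_config.json text and return the systems server's command."""
    config = json.loads(text)
    servers = config.get("mcpServers", {})
    if "systems" not in servers:
        raise ValueError("no 'systems' entry under mcpServers")
    return servers["systems"]["command"]

# The config section from step 4 above.
example = """
{
  "mcpServers": {
    "systems": {
      "command": "uv",
      "args": ["--directory", "/Users/will/systems-mcp", "run", "main.py"]
    }
  }
}
"""
print(check_mcp_config(example))  # uv
```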