Physics MCP Server

by BlinkZer0

Home · Docs · Architecture · Configuration · Tools: CAS · Plot · NLI · Report · Tensor · Quantum · StatMech

A specialized MCP (Model Context Protocol) server for physicists, providing Computer Algebra System (CAS), plotting, and natural language interface capabilities.

Features

Phase 1-3 (Current)

  • CAS tools: evaluate, differentiate, integrate, solve equations/ODEs with optional units

  • Plot tools: 2D functions, parametric curves, 2D vector fields, 3D surfaces, contours, phase portraits (PNG + CSV + SVG)

  • NLI tool: parse natural language into structured tool calls (LM Studio compatible)

  • Units & Constants: Pint-based unit conversion, CODATA and astrophysical constants

  • Report tool: generate Markdown reports from persisted sessions

  • Tensor algebra: Christoffel symbols, curvature tensors, geodesics

  • Quantum mechanics: operator algebra, standard problems (SHO, particle in box), Bloch sphere visualization

  • Statistical mechanics: partition functions, thermodynamic quantities

  • Device acceleration: Optional GPU acceleration via PyTorch with CPU fallback
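To make the statistical-mechanics feature concrete, here is a minimal pure-Python sketch of what a partition-function computation involves; the actual tool runs in the Python worker and its real API may differ, so treat the function names here as illustrative only.

```python
import math

def partition_function(energies, beta):
    """Canonical partition function Z = sum_i exp(-beta * E_i)."""
    return sum(math.exp(-beta * e) for e in energies)

def mean_energy(energies, beta):
    """Thermal average <E> = sum_i E_i * exp(-beta * E_i) / Z."""
    z = partition_function(energies, beta)
    return sum(e * math.exp(-beta * e) for e in energies) / z

# Two-level system: at beta = 0 (infinite temperature) both levels are
# equally populated, so Z = 2 and <E> = 0.5.
z_inf_t = partition_function([0.0, 1.0], 0.0)
e_inf_t = mean_energy([0.0, 1.0], 0.0)
```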

Architecture

phys-mcp/
├── packages/
│   ├── server/           # TypeScript MCP server
│   ├── tools-cas/        # CAS tool adapters
│   ├── tools-plot/       # Plotting tool adapters
│   ├── tools-nli/        # NLI parser
│   ├── tools-units/      # Unit conversion tools
│   ├── tools-constants/  # Physical constants
│   ├── tools-report/     # Report generation
│   ├── tools-tensor/     # Tensor algebra (Phase 3)
│   ├── tools-quantum/    # Quantum mechanics (Phase 3)
│   ├── tools-statmech/   # Statistical mechanics (Phase 3)
│   └── python-worker/    # Python computation backend
├── examples/
│   └── requests/         # Example JSON-RPC requests
├── scripts/              # Dev/format/build helpers
└── mcp_config.json       # MCP server configuration

Quick Start

Prerequisites

  • Node.js 20+

  • Python 3.11+

  • pnpm 8+

Optional (recommended for faster NLI):

  • LM Studio or any OpenAI-compatible local LM server

Installation

# Install dependencies
pnpm install

# Install Python dependencies
cd packages/python-worker
pip install -r requirements.txt
cd ../..

# Build all packages
pnpm build

# Start development server
pnpm dev

Configuration

  • Environment variables used by NLI: LM_BASE_URL, LM_API_KEY (optional), DEFAULT_MODEL

  • See mcp_config.json for a working example of server + env configuration

  • Add the server to your MCP client configuration

See docs/Configuration for details.
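As a rough sketch of what an MCP client entry for this server can look like, the fragment below uses the standard `mcpServers` client-config shape; the paths, the server key name, and the model value are all illustrative assumptions, so consult the repository's shipped mcp_config.json for the authoritative version.

```json
{
  "mcpServers": {
    "phys-mcp": {
      "command": "node",
      "args": ["packages/server/dist/index.js"],
      "env": {
        "LM_BASE_URL": "http://localhost:1234/v1",
        "DEFAULT_MODEL": "your-local-model"
      }
    }
  }
}
```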

Optional: Faster NLI with LM Studio

LM Studio is not required. All CAS/plot/tensor/quantum/stat-mech calculations run in TypeScript/Python workers and work out of the box. Configuring a local LM endpoint such as LM Studio only accelerates the Natural Language Interface (NLI) that turns plain English into structured tool calls.

Why it helps

  • Lower latency: local inference avoids network round-trips and rate limits.

  • GPU utilization: LM Studio can use your GPU to speed up prompt parsing.

  • Better parsing on complex requests: higher-quality intent extraction reduces retries before calculations begin.

  • Privacy & cost: keep tokens local; no external API keys required.

How it speeds up “calculations” end-to-end

  • The math is computed by our Python/TS backends; the LM is used to decide “what to compute.” Faster parsing → fewer back-and-forths → quicker CAS/plot calls → faster overall results.

How to enable

  • Install and run LM Studio (or any OpenAI-compatible local server).

  • Set LM_BASE_URL (e.g., http://localhost:1234/v1) and DEFAULT_MODEL.

  • Optionally set LM_API_KEY if your local server requires it.
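The steps above translate to a few environment variables before launching the server; the endpoint matches LM Studio's default, while the model name is a placeholder for whatever model your local server has loaded.

```shell
# Point the NLI at a local OpenAI-compatible endpoint (values are illustrative)
export LM_BASE_URL="http://localhost:1234/v1"
export DEFAULT_MODEL="your-local-model"     # set to the model loaded in LM Studio
# export LM_API_KEY="..."                   # only if your local server enforces a key
```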

Example Usage

// Differentiate an expression
{ "jsonrpc": "2.0", "id": "1", "method": "cas_diff", "params": { "expr": "sin(x**2)", "symbol": "x" } }

// Evaluate with units
{ "jsonrpc": "2.0", "id": "2", "method": "cas_evaluate", "params": { "expr": "(1/2)*m*v**2", "vars": { "m": {"value": 1.0, "unit": "kg"}, "v": {"value": 3.0, "unit": "m/s"} } } }

// Plot a function
{ "jsonrpc": "2.0", "id": "3", "method": "plot_function_2d", "params": { "f": "sin(x)", "x_min": 0, "x_max": 6.28318, "dpi": 160 } }

// Natural language parsing
{ "jsonrpc": "2.0", "id": "4", "method": "nli_parse", "params": { "text": "Solve y'' + y = 0 with y(0)=0 and y'(0)=1" } }
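Building such requests programmatically is straightforward; the helper below is a hypothetical convenience (not part of the server's API) that wraps a tool call in the JSON-RPC 2.0 envelope shown above. The actual transport, stdio or HTTP, depends on your MCP client.

```python
import json

def make_request(req_id, method, params):
    """Wrap a Phys-MCP tool call in a JSON-RPC 2.0 envelope."""
    return json.dumps({
        "jsonrpc": "2.0",
        "id": str(req_id),
        "method": method,
        "params": params,
    })

# Same differentiation request as the first example above
msg = make_request(1, "cas_diff", {"expr": "sin(x**2)", "symbol": "x"})
parsed = json.loads(msg)
```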

Development

Building

pnpm build

Linting & Formatting

pnpm lint
pnpm format

Testing

pnpm test
pnpm run test:install

Documentation

  • Docs index: docs/README.md

  • Architecture: docs/Architecture.md

  • Configuration: docs/Configuration.md

  • Tools:

    • CAS: docs/Tools/CAS.md

    • Plot: docs/Tools/Plot.md

    • NLI: docs/Tools/NLI.md

    • Report: docs/Tools/Report.md

    • Tensor: docs/Tools/Tensor.md

    • Quantum: docs/Tools/Quantum.md

    • StatMech: docs/Tools/StatMech.md

  • Examples: examples/requests/

Side note: We conserve clarity and momentum—any dispersion is purely numerical.

Roadmap

Phase 2+: tensor calculus (sympy.diffgeom), quantum ops (qutip), 3D rendering, PDE/FEM, scientific data I/O, LaTeX/PDF reporting.

License

MIT License - see LICENSE file for details.

