Getting Started

We present the Planning Copilot, a chatbot that integrates multiple planning tools and lets users run them with natural-language instructions. It is built on the Model Context Protocol (MCP), which makes it easy for language models to interact with external tools and systems.

The Planning Copilot is modular, so each part can be swapped out, upgraded, or extended without affecting the rest of the system. In the current implementation, Solve uses FastDownward for classical planning and Metric-FF for numeric planning, Verify uses VAL to validate plans, and Execute relies on PDDL_Plus_Parser to simulate and track plan execution.
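The modular design described above can be sketched as a small tool registry: each capability (Solve, Verify, Execute) sits behind a common interface, so one backend can be swapped for another without touching the rest of the system. The function names and signatures below are illustrative assumptions, not the repository's actual API:

```python
# Minimal sketch of the modular tool-dispatch idea. Each planning
# capability is registered under a name and can be replaced independently.
from typing import Callable, Dict

TOOLS: Dict[str, Callable[..., str]] = {}

def tool(name: str):
    """Register a planning tool under a given name."""
    def register(fn: Callable[..., str]) -> Callable[..., str]:
        TOOLS[name] = fn
        return fn
    return register

@tool("solve")
def solve(domain: str, problem: str) -> str:
    # In the real system this step calls Fast Downward or Metric-FF.
    return f"plan for {problem} in {domain}"

@tool("verify")
def verify(plan: str) -> str:
    # In the real system this step calls VAL.
    return f"validated: {plan}"

def dispatch(name: str, **kwargs) -> str:
    """Route a tool call (derived from a user instruction) to its backend."""
    return TOOLS[name](**kwargs)
```

Swapping FastDownward for another classical planner would then mean re-registering `solve` with a different backend, leaving `verify` and the dispatcher untouched.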

Dependencies

  1. Make sure that Python 3.10 is installed and active (e.g., via a virtual environment or conda environment).

  2. Install the latest version of Ollama to run the LLM locally.

  3. Install all project requirements:

python -m pip install -r requirements.txt

Usage

How to use the environment:

  1. Update all the paths and settings in the config.py file.

  2. Run the LLM chat with:

python app.py

  3. To change the LLM, edit the llm_with_tools.py file.

  4. To add new tools, modify the MCP server in solvers_server.py.
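Step 1 asks you to point config.py at your local installations. As a hypothetical sketch of what such a file might contain (all variable names and paths below are assumptions; check the actual file for the real keys):

```python
# Hypothetical config.py sketch -- the variable names in the repository
# may differ; adjust to match the real file.
from pathlib import Path

# Planner and validator binaries (update to your local installation paths)
FAST_DOWNWARD_PATH = Path("/opt/fast-downward/fast-downward.py")
METRIC_FF_PATH = Path("/opt/metric-ff/ff")
VAL_PATH = Path("/opt/val/validate")

# Local LLM served by Ollama
OLLAMA_MODEL = "llama3"
```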

Citations

If you find our work interesting or the repo useful, please consider citing this paper:

@article{benyamin2025toward,
  title={Toward PDDL Planning Copilot},
  author={Benyamin, Yarin and Mordoch, Argaman and Shperberg, Shahaf S and Stern, Roni},
  journal={arXiv preprint arXiv:2509.12987},
  year={2025}
}
