Directory structure:
└── mindm-mcp/
    ├── README.md
    ├── LICENSE
    ├── pyproject.toml
    ├── assets/
    │   ├── claude_desktop_config.1.json
    │   └── claude_desktop_config.2.json
    ├── examples/
    │   └── test_server.py
    └── mindm_mcp/
        ├── __init__.py
        └── server.py
================================================
FILE: README.md
================================================
# MindManager MCP Server
A Model Context Protocol (MCP) server implementation for the `mindm` library, providing a standardized interface to interact with MindManager on Windows and macOS.
[mindm-mcp on PyPI](https://pypi.org/project/mindm-mcp/) · [mindm on PyPI](https://pypi.org/project/mindm/)
## Overview
This server allows you to programmatically interact with MindManager through the Model Context Protocol (MCP), a standardized way to provide context and tools to LLMs. It leverages the `mindm` library to manipulate MindManager documents, topics, relationships, and other mindmap elements.
### Animated examples (using Codex in VSCode on Windows)
- Example: refining a single topic, adding tags and icons to the new topics:

  _get the current Mindmanager mindmap as mermaid (full), refine the given topics at least two levels deep (not just topics), add a tag "Important" to the 3 top most important topics and add corresponding stock icons to every topic but the central topic and create the new MindManager mindmap_

- Example: refining a single topic, generating note text for the new topics.

- Example: cloning a map including all supported topic properties.
## Other examples
### Client
`get the current MindManager mindmap, translate every topic to German and create a new MindManager mindmap (simple).`
`get the current MindManager mindmap, refine each topic 2 levels and create a new MindManager mindmap (simple).`
### Shell
`codex exec "get the current MindManager mindmap, add a meaningful emoji to every topic and create new MindManager mindmap (simple)"`
_remark: emojis only work on macOS._
## Features
- Retrieve mindmap structure and central topics
- Export mindmaps to Mermaid, Markdown, and JSON formats for use in LLM chats
- Create MindManager mindmaps directly from Mermaid (full or simplified syntax)
- Fetch MindManager and package versions for quick diagnostics
- Get information about MindManager installation and library folders
- Get current selection from MindManager
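
For illustration, the "full" Mermaid syntax attaches topic metadata as JSON after a `%%` marker on each line (abridged from the example in `examples/test_server.py`; the nesting depth shown here is illustrative):

```mermaid
mindmap
  [Creating an AI startup] %% {"id": 1}
    [Vision & Strategy] %% {"id": 2}
      [Mission and Value] %% {"id": 3}
    [Product & Tech] %% {"id": 19}
      [MVP Design] %% {"id": 20}
```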
## Planned Features
- Add, modify, and manipulate topics and subtopics without Mermaid roundtrips
- Add relationships between topics
- Add tags to topics
- Set document background images
## Requirements
- Python 3.12 or higher
- `mcp` package (Model Context Protocol SDK)
- `mindm` library (included in this project)
- MindManager (version 23 or later) installed on Windows or macOS
## Installation macOS
```bash
# Clone the repository (if you're using it from a repository)
git clone https://github.com/robertZaufall/mindm-mcp.git
cd mindm-mcp
# create a virtual environment for Python
brew install uv # if needed
uv pip install -r pyproject.toml
# alternative: manual installation of modules
uv add "mcp[cli]"
uv add fastmcp
uv add markdown-it-py
uv add -U --index-url=https://test.pypi.org/simple/ --extra-index-url=https://pypi.org/simple/ mindm mindm-mcp
```
## Installation Windows
```bash
# Change to DOS command prompt
cmd
# Clone the repository (if you're using it from a repository)
git clone https://github.com/robertZaufall/mindm-mcp.git
cd mindm-mcp
# create a virtual environment for Python
pip install uv # if needed
uv pip install -r pyproject.toml
# install nodejs
choco install nodejs # if you have Chocolatey installed; otherwise install Node.js by other means
refreshenv
node -v
npm install -g npx
```
## Usage
### MCP inspector
```bash
# run mcp with inspector
uv run --with mindm --with fastmcp --with markdown-it-py mcp dev mindm_mcp/server.py
```
### Claude Desktop
#### Local python file
Adjust the path for the local file as needed.
```json
{
  "mcpServers": {
    "mindm (MindManager)": {
      "command": "uv",
      "args": [
        "run",
        "--with",
        "mindm>=0.0.5.3",
        "--with",
        "fastmcp",
        "--with",
        "markdown-it-py",
        "/Users/master/git/mindm-mcp/mindm_mcp/server.py"
      ]
    }
  }
}
```
#### Module from package repository
Adjust `VIRTUAL_ENV` as needed.
```json
{
  "mcpServers": {
    "mindm (MindManager)": {
      "command": "uv",
      "args": [
        "run",
        "--with",
        "mindm>=0.0.5.3",
        "--with",
        "mindm-mcp>=0.0.2.1",
        "--with",
        "fastmcp",
        "--with",
        "markdown-it-py",
        "-m",
        "mindm_mcp.server"
      ],
      "env": {
        "VIRTUAL_ENV": "/Users/master/git/mindm-mcp/.venv"
      }
    }
  }
}
```
Hint: If the MCP server does not show up with the hammer icon on Windows, close Claude Desktop and kill all background processes.
### Codex (VSCode Extension / CLI)
#### Local python file
`config.toml` (adjust the path for the local file as needed).
```toml
[features]
rmcp_client = true
[mcp_servers.mindmanager]
command = "uv"
args = ["run", "--with", "mindm>=0.0.5.3", "--with", "fastmcp", "--with", "markdown-it-py", "/Users/master/git/mindm-mcp/mindm_mcp/server.py"]
```
#### Module from package repository
`config.toml`
```toml
[features]
rmcp_client = true
[mcp_servers.mindmanager]
command = "uv"
args = ["run", "--with", "mindm>=0.0.5.3", "--with", "fastmcp", "--with", "markdown-it-py", "--with", "mindm-mcp>=0.0.2.1", "-m", "mindm_mcp.server"]
```
### VSCode Chat (GitHub Copilot)
#### Local python file
Adjust the path for the local file as needed.
```bash
uv run --with "mindm>=0.0.5.3" --with fastmcp --with markdown-it-py /Users/master/git/mindm-mcp/mindm_mcp/server.py
```
or server definition in `mcp.json`:
```json
"Mindmanager": {
  "type": "stdio",
  "command": "uv",
  "args": [
    "run",
    "--with",
    "mindm>=0.0.5.3",
    "--with",
    "fastmcp",
    "--with",
    "markdown-it-py",
    "/Users/master/git/mindm-mcp/mindm_mcp/server.py"
  ]
}
```
#### Module from package repository
```bash
uv run --with "mindm>=0.0.5.3" --with fastmcp --with markdown-it-py --with "mindm-mcp>=0.0.2.1" -m mindm_mcp.server
```
or server definition in `mcp.json`:
```json
"Mindmanager": {
  "type": "stdio",
  "command": "uv",
  "args": [
    "run",
    "--with",
    "mindm>=0.0.5.3",
    "--with",
    "fastmcp",
    "--with",
    "markdown-it-py",
    "--with",
    "mindm-mcp>=0.0.2.1",
    "-m",
    "mindm_mcp.server"
  ]
}
```
## MCP Tools
The server exposes the following tools through the Model Context Protocol:
### Document Interaction
- `get_mindmap`: Retrieves the current mindmap structure from MindManager
- `get_selection`: Retrieves the currently selected topics in MindManager
- `get_library_folder`: Gets the path to the MindManager library folder
- `get_mindmanager_version`: Gets the installed MindManager version
- `get_grounding_information`: Extracts grounding information (central topic, selected subtopics) from the mindmap
### Serialization
- `serialize_current_mindmap_to_mermaid`: Serializes the currently loaded mindmap to Mermaid format
- `serialize_current_mindmap_to_markdown`: Serializes the currently loaded mindmap to Markdown format
- `serialize_current_mindmap_to_json`: Serializes the currently loaded mindmap to a detailed JSON object with ID mapping
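
The actual Markdown output is defined by the `mindm` library's `serialization` module. As a rough sketch of the idea only (a hypothetical helper, not the library's implementation), serializing a topic tree to Markdown means mapping nesting depth to heading levels:

```python
# Hypothetical sketch: render a nested topic tree as Markdown headings.
# This is NOT the mindm library's serializer, just an illustration of
# what "serialize the mindmap to Markdown" means conceptually.
def topics_to_markdown(topic: dict, level: int = 1) -> str:
    # Markdown supports at most six heading levels, so clamp the depth.
    lines = [f"{'#' * min(level, 6)} {topic['text']}"]
    for sub in topic.get("subtopics", []):
        lines.append(topics_to_markdown(sub, level + 1))
    return "\n".join(lines)

tree = {
    "text": "Creating an AI startup",
    "subtopics": [
        {"text": "Vision & Strategy",
         "subtopics": [{"text": "Mission and Value", "subtopics": []}]},
    ],
}
print(topics_to_markdown(tree))
```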
### Creation
- `create_mindmap_from_mermaid`: Build a MindManager map from Mermaid (full syntax with IDs and metadata)
- `create_mindmap_from_mermaid_simple`: Build a MindManager map from simplified Mermaid text
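
In the simplified syntax, hierarchy is encoded purely through indentation. As a hedged sketch of that mapping (a hypothetical parser, not the library's `deserialize_mermaid_simple`), each line's indentation determines its parent:

```python
# Hypothetical sketch: turn simplified Mermaid mindmap text into a topic
# tree. Illustrative only; the mindm library's real parser may differ.
def parse_simple_mermaid(text: str) -> dict:
    root = None
    stack = []  # (indent, node) pairs from root down to the current branch
    for raw in text.splitlines():
        if not raw.strip() or raw.strip() == "mindmap":
            continue  # skip blank lines and the "mindmap" header
        indent = len(raw) - len(raw.lstrip())
        node = {"text": raw.strip(), "subtopics": []}
        # Pop until the top of the stack is this node's parent.
        while stack and stack[-1][0] >= indent:
            stack.pop()
        if stack:
            stack[-1][1]["subtopics"].append(node)
        else:
            root = node
        stack.append((indent, node))
    return root

tree = parse_simple_mermaid("""mindmap
  Creating an AI startup
    Vision & Strategy
    Product & Tech
""")
```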
### Versioning
- `get_versions`: Returns the `mindm-mcp` and `mindm` package versions for debugging
## Platform Support
- **Windows**: Full support for topics, notes, icons, images, tags, links, relationships, and RTF formatting
- **macOS**: Support for topics, notes, and relationships (limited support compared to Windows)
## Integration with Claude and other LLMs
This MCP server can be installed in Claude Desktop or other MCP-compatible applications, allowing LLMs to:
1. Access mindmap content
2. Manipulate mindmaps (coming)
3. Create new mindmaps based on LLM-generated content (coming)
## Troubleshooting
- Ensure MindManager is running before starting the server
- For macOS, make sure you allow Claude Desktop to automate MindManager
## MCPHub
[Certified on MCPHub](https://mcphub.com/mcp-servers/robertZaufall/mindm-mcp)
## Acknowledgements
This project is built upon the `mindm` library, providing Python interfaces to MindManager on Windows and macOS platforms. It uses the Model Context Protocol (MCP) SDK developed by Anthropic.
## License
MIT License - See LICENSE file for details
================================================
FILE: LICENSE
================================================
MIT License
Copyright (c) 2025 Robert Zaufall
Permission is hereby granted, free of charge, to any person obtaining a copy
of this software and associated documentation files (the "Software"), to deal
in the Software without restriction, including without limitation the rights
to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
copies of the Software, and to permit persons to whom the Software is
furnished to do so, subject to the following conditions:
The above copyright notice and this permission notice shall be included in all
copies or substantial portions of the Software.
THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
SOFTWARE.
================================================
FILE: pyproject.toml
================================================
[build-system]
requires = ["setuptools>=61.0"]
build-backend = "setuptools.build_meta"

[project]
name = "mindm_mcp"
version = "0.0.2.3"
authors = [
    { name="Robert Zaufall" },
]
description = "Model Context Protocol (MCP) server for the mindm library, enabling AI assistants like Claude to interact with MindManager."
readme = "README.md"
requires-python = ">=3.12"
classifiers = [
    "Programming Language :: Python :: 3",
    "License :: OSI Approved :: MIT License",
    "Operating System :: Microsoft :: Windows",
    "Operating System :: MacOS :: MacOS X",
]
dependencies = [
    "mindm>=0.0.5.3",
    "uvicorn>=0.22.0",
    "aiohttp>=3.8.4",
    "pydantic>=1.10.7",
    "websockets>=10.4",
    "fastmcp>=0.4.1",
    "mcp[cli]>=1.6.0",
    "build>=1.2.2.post1",
]

[project.urls]
"Homepage" = "https://github.com/robertZaufall/mindm-mcp"
"Bug Tracker" = "https://github.com/robertZaufall/mindm-mcp/issues"

[project.scripts]
mindm-mcp = "mindm_mcp.server:main"

[tool.setuptools]
packages = ["mindm_mcp"]
================================================
FILE: assets/claude_desktop_config.1.json
================================================
{
  "mcpServers": {
    "mindm (MindManager)": {
      "command": "uv",
      "args": [
        "run",
        "--with",
        "mindm>=0.0.5.3",
        "--with",
        "mindm-mcp>=0.0.2.0",
        "--with",
        "fastmcp",
        "--with",
        "markdown-it-py",
        "-m",
        "mindm_mcp.server"
      ],
      "env": {
        "VIRTUAL_ENV": "/Users/master/git/mindm-mcp/.venv"
      }
    }
  }
}
================================================
FILE: assets/claude_desktop_config.2.json
================================================
{
  "mcpServers": {
    "mindm (MindManager)": {
      "command": "uv",
      "args": [
        "run",
        "--with",
        "mindm>=0.0.5.3",
        "--with",
        "fastmcp",
        "--with",
        "markdown-it-py",
        "/Users/master/git/mindm-mcp/mindm_mcp/server.py"
      ]
    }
  }
}
================================================
FILE: examples/test_server.py
================================================
import asyncio
import json
import sys
import os
from typing import Any, Dict, List, Union
sys.path.append(os.path.abspath(os.path.join(os.path.dirname(__file__), '..')))
import mindm_mcp.server as server
MERMAID_FULL_EXAMPLE = """
mindmap
  [Creating an AI startup] %% {"id": 1}
    [Vision & Strategy] %% {"id": 2}
      [Mission and Value] %% {"id": 3}
        [Problem statement] %% {"id": 4}
        [Value proposition] %% {"id": 5}
        [Long term goals] %% {"id": 6}
      [Competitive Positioning] %% {"id": 7}
        [Differentiation pillars] %% {"id": 8}
        [Key competitors map] %% {"id": 9}
        [Barrier strategies] %% {"id": 10}
    [Product & Tech] %% {"id": 19}
      [MVP Design] %% {"id": 20}
        [Core feature set] %% {"id": 21}
        [User flows map] %% {"id": 22}
        [Rapid prototyping] %% {"id": 23}
"""

MERMAID_SIMPLE_EXAMPLE = """
mindmap
  Creating an AI startup
    Vision & Strategy
      Mission and Value
        Problem statement
        Value proposition
        Long term goals
      Competitive Positioning
        Differentiation pillars
        Key competitors map
        Barrier strategies
    Product & Tech
      MVP Design
        Core feature set
        User flows map
        Rapid prototyping
"""
async def call_get_mindmap():
    """Calls server.get_mindmap with different parameters."""
    print("\n--- Testing get_mindmap ---")
    modes = ['full', 'content', 'text']
    turbo_modes = [True, False]
    for mode in modes:
        for turbo_mode in turbo_modes:
            print(f"Calling get_mindmap(mode='{mode}', turbo_mode={turbo_mode})")
            result = await server.get_mindmap(mode=mode, turbo_mode=turbo_mode)
            print(f"Result: {json.dumps(result, indent=2)}")

async def call_get_selection():
    """Calls server.get_selection with different parameters."""
    print("\n--- Testing get_selection ---")
    modes = ['full', 'content', 'text']
    turbo_modes = [True, False]
    for mode in modes:
        for turbo_mode in turbo_modes:
            print(f"Calling get_selection(mode='{mode}', turbo_mode={turbo_mode})")
            result = await server.get_selection(mode=mode, turbo_mode=turbo_mode)
            print(f"Result: {json.dumps(result, indent=2)}")

async def call_get_library_folder():
    """Calls server.get_library_folder."""
    print("\n--- Testing get_library_folder ---")
    print("Calling get_library_folder()")
    result = await server.get_library_folder()
    print(f"Result: {json.dumps(result, indent=2)}")

async def call_get_grounding_information():
    """Calls server.get_grounding_information with different parameters."""
    print("\n--- Testing get_grounding_information ---")
    modes = ['full', 'content', 'text']
    turbo_modes = [True, False]
    for mode in modes:
        for turbo_mode in turbo_modes:
            print(f"Calling get_grounding_information(mode='{mode}', turbo_mode={turbo_mode})")
            result = await server.get_grounding_information(mode=mode, turbo_mode=turbo_mode)
            print(f"Result: {json.dumps(result, indent=2)}")

async def call_serialize_current_mindmap_to_mermaid():
    """Calls server.serialize_current_mindmap_to_mermaid with different parameters."""
    print("\n--- Testing serialize_current_mindmap_to_mermaid ---")
    id_only_options = [True, False]
    modes = ['full', 'content', 'text']
    turbo_modes = [True, False]
    for id_only in id_only_options:
        for mode in modes:
            for turbo_mode in turbo_modes:
                print(f"Calling serialize_current_mindmap_to_mermaid(id_only={id_only}, mode='{mode}', turbo_mode={turbo_mode})")
                result = await server.serialize_current_mindmap_to_mermaid(id_only=id_only, mode=mode, turbo_mode=turbo_mode)
                print(f"Result: {json.dumps(result, indent=2)}")

async def call_serialize_current_mindmap_to_markdown():
    """Calls server.serialize_current_mindmap_to_markdown with different parameters."""
    print("\n--- Testing serialize_current_mindmap_to_markdown ---")
    include_notes_options = [True, False]
    modes = ['full', 'content', 'text']
    turbo_modes = [True, False]
    for include_notes in include_notes_options:
        for mode in modes:
            for turbo_mode in turbo_modes:
                print(f"Calling serialize_current_mindmap_to_markdown(include_notes={include_notes}, mode='{mode}', turbo_mode={turbo_mode})")
                result = await server.serialize_current_mindmap_to_markdown(include_notes=include_notes, mode=mode, turbo_mode=turbo_mode)
                print(f"Result: {json.dumps(result, indent=2)}")

async def call_serialize_current_mindmap_to_json():
    """Calls server.serialize_current_mindmap_to_json with different parameters."""
    print("\n--- Testing serialize_current_mindmap_to_json ---")
    ignore_rtf_options = [True, False]
    modes = ['full', 'content', 'text']
    turbo_modes = [True, False]
    for ignore_rtf in ignore_rtf_options:
        for mode in modes:
            for turbo_mode in turbo_modes:
                print(f"Calling serialize_current_mindmap_to_json(ignore_rtf={ignore_rtf}, mode='{mode}', turbo_mode={turbo_mode})")
                result = await server.serialize_current_mindmap_to_json(ignore_rtf=ignore_rtf, mode=mode, turbo_mode=turbo_mode)
                print(f"Result: {json.dumps(result, indent=2)}")

async def call_create_mindmap_from_mermaid():
    """Calls server.create_mindmap_from_mermaid with an example diagram."""
    print("\n--- Testing create_mindmap_from_mermaid ---")
    print("Calling create_mindmap_from_mermaid()")
    # Note: this tool takes only the mermaid text; it controls turbo mode internally.
    result = await server.create_mindmap_from_mermaid(
        mermaid=MERMAID_FULL_EXAMPLE,
    )
    print(f"Result: {json.dumps(result, indent=2)}")

async def call_create_mindmap_from_mermaid_simple():
    """Calls server.create_mindmap_from_mermaid_simple with an example diagram."""
    print("\n--- Testing create_mindmap_from_mermaid_simple ---")
    print("Calling create_mindmap_from_mermaid_simple(turbo_mode=True)")
    result = await server.create_mindmap_from_mermaid_simple(
        mermaid=MERMAID_SIMPLE_EXAMPLE,
        turbo_mode=True,
    )
    print(f"Result: {json.dumps(result, indent=2)}")

async def main():
    """Calls all the test functions."""
    await call_get_mindmap()
    await call_get_selection()
    await call_get_library_folder()
    await call_get_grounding_information()
    await call_serialize_current_mindmap_to_mermaid()
    await call_serialize_current_mindmap_to_markdown()
    await call_serialize_current_mindmap_to_json()
    await call_create_mindmap_from_mermaid()
    await call_create_mindmap_from_mermaid_simple()

if __name__ == "__main__":
    # Check if MindManager is running before running the tests
    try:
        import mindm.mindmanager as mm
        server._get_library_folder()
        print("MindManager is running. Proceeding with tests.")
        asyncio.run(main())
    except Exception as e:
        print(f"Error: MindManager is not running or an error occurred: {e}")
        print("Please ensure MindManager is running and try again.")
        sys.exit(1)
================================================
FILE: mindm_mcp/__init__.py
================================================
[Empty file]
================================================
FILE: mindm_mcp/server.py
================================================
#!/usr/bin/env python3
"""
server.py - FastMCP implementation for the mindm library
This module implements a Model Context Protocol (MCP) server
for interacting with MindManager through the mindm library using FastMCP.
"""
import sys
from typing import Dict, Any, List, Optional, Union
from contextlib import asynccontextmanager
from collections.abc import AsyncIterator
from dataclasses import dataclass
import asyncio
import json
from mcp.server.fastmcp import FastMCP, Context
from mindmap.mindmap import MindmapDocument, MindmapTopic
from mindmap import serialization, helpers
import mindm
from mindm import mindmanager as mm
try:
    from importlib.metadata import version as _version
    __version__ = _version("mindm_mcp")
except ImportError:
    __version__ = "unknown"
# --- Globals ---
SERVER_NAME = "mindm (MindManager)"
def _create_mcp_server() -> FastMCP:
    """
    Instantiate FastMCP while remaining compatible with versions of the MCP
    package that don't support the `version` keyword argument.
    """
    try:
        return FastMCP(SERVER_NAME, version=__version__)
    except TypeError as exc:
        if "unexpected keyword argument 'version'" not in str(exc):
            raise
        print(
            "FastMCP does not accept a 'version' argument; continuing without it.",
            file=sys.stderr,
        )
        return FastMCP(SERVER_NAME)
# Initialize the MCP server (handles FastMCP compatibility automatically)
mcp = _create_mcp_server()
doc_lock = asyncio.Lock()
# --- Helper Functions ---
def _serialize_result(data: Any) -> Union[Dict, List, str, int, float, bool, None]:
    """Helper to serialize results, especially MindmapTopic structures."""
    if isinstance(data, (MindmapTopic, list)):
        # Use simple serialization for MCP results unless full detail is needed
        return serialization.serialize_object_simple(data)
    elif isinstance(data, tuple):
        # Tuples are often JSON serializable directly if elements are
        return list(data)  # Convert to list for guaranteed JSON compatibility
    elif isinstance(data, (dict, str, int, float, bool, type(None))):
        return data
    else:
        # Attempt string conversion for unknown types
        print(f"Warning: Serializing unknown type {type(data)} as string.", file=sys.stderr)
        return str(data)

def _handle_mindmanager_error(func_name: str, e: Exception) -> Dict[str, str]:
    """Formats MindManager errors for MCP response."""
    error_message = f"Error during MindManager operation '{func_name}': {e}"
    print(f"ERROR: {error_message}", file=sys.stderr)
    # Check for specific known errors from mindm.mindmanager if possible
    if "No document found" in str(e):
        return {"error": "MindManager Error", "message": "No document found or MindManager not running."}
    # Add more specific error checks here based on mindm library
    return {"error": "MindManager Error", "message": f"An error occurred: {e}"}
# --- Internal functions ---
MACOS_ACCESS_METHOD = 'applescript' # appscript is not working with MCPs
def _get_document_instance(
    charttype: str = 'auto',
    turbo_mode: bool = False,
    inline_editing_mode: bool = False,
    mermaid_mode: bool = True,
    macos_access: str = MACOS_ACCESS_METHOD
) -> MindmapDocument:
    document = MindmapDocument(
        charttype=charttype,
        turbo_mode=turbo_mode,
        inline_editing_mode=inline_editing_mode,
        mermaid_mode=mermaid_mode,
        macos_access=macos_access
    )
    return document

def _get_selection(mode='content', turbo_mode=False):
    document = _get_document_instance(turbo_mode=turbo_mode)
    if document.get_mindmap(mode=mode):
        selection = document.get_selection()
        return selection
    return None

def _get_grounding_information(mode='text', turbo_mode=True):
    document = _get_document_instance(turbo_mode=turbo_mode)
    if document.get_mindmap(mode=mode):
        document.get_selection()
        return document.get_grounding_information()
    return None

def _get_mindmap_content(mode='content', turbo_mode=False):
    document = _get_document_instance(turbo_mode=turbo_mode)
    if document.get_mindmap(mode=mode):
        return document.mindmap
    return None

def _serialize_mermaid(id_only=True, mode='content', turbo_mode=False):
    document = _get_document_instance(turbo_mode=turbo_mode)
    if document.get_mindmap(mode=mode):
        guid_mapping = {}
        serialization.build_mapping(document.mindmap, guid_mapping)
        mermaid = serialization.serialize_mindmap(document.mindmap, guid_mapping, id_only=id_only)
        return mermaid
    return None

def _deserialize_mermaid(mermaid="", turbo_mode=True):
    guid_mapping = {}
    deserialized = serialization.deserialize_mermaid_full(mermaid, guid_mapping)
    document = _get_document_instance(turbo_mode=turbo_mode)
    document.mindmap = deserialized
    document.create_mindmap()
    return None

def _deserialize_mermaid_simple(mermaid="", turbo_mode=True):
    deserialized = serialization.deserialize_mermaid_simple(mermaid)
    document = _get_document_instance(turbo_mode=turbo_mode)
    document.mindmap = deserialized
    document.create_mindmap()
    return None

def _serialize_markdown(include_notes=True, mode='content', turbo_mode=False):
    document = _get_document_instance(turbo_mode=turbo_mode)
    if document.get_mindmap(mode=mode):
        markdown = serialization.serialize_mindmap_markdown(document.mindmap, include_notes=include_notes)
        return markdown
    return None

def _serialize_json(ignore_rtf=True, mode='content', turbo_mode=False):
    document = _get_document_instance(turbo_mode=turbo_mode)
    if document.get_mindmap(mode=mode):
        json_obj = serialization.serialize_object_simple(document.mindmap, ignore_rtf=ignore_rtf)
        return json_obj
    return None

def _get_library_folder():
    mindmanager_obj = mm.Mindmanager()
    library_folder = mindmanager_obj.get_library_folder()
    return library_folder
# --- MCP Tools ---
# == MindmapDocument Methods ==
@mcp.tool()
async def get_mindmap(
    mode: str = 'full',
    turbo_mode: bool = False
) -> Dict[str, Any]:
    """
    Retrieves the current mind map structure from MindManager.

    Args:
        mode (str): Detail level ('full', 'content', 'text'). Defaults to 'full'.
        turbo_mode (bool): Enable turbo mode (text only). Defaults to False.

    Returns:
        Dict[str, Any]: Serialized mind map structure or error dictionary.
    """
    try:
        print(f"Calling get_mindmap(mode={mode}, turbo_mode={turbo_mode})", file=sys.stderr)
        mindmap = _get_mindmap_content(mode=mode, turbo_mode=turbo_mode)
        print("get_mindmap successful, returning serialized mindmap.", file=sys.stderr)
        return _serialize_result(mindmap)
    except Exception as e:
        return _handle_mindmanager_error("get_mindmap", e)

@mcp.tool()
async def get_selection(
    mode: str = 'full',
    turbo_mode: bool = False
) -> Union[List[Dict[str, Any]], Dict[str, str]]:
    """
    Retrieves the currently selected topics in MindManager.

    Args:
        mode (str): Detail level ('full', 'content', 'text'). Defaults to 'full'.
        turbo_mode (bool): Enable turbo mode (text only). Defaults to False.

    Returns:
        Union[List[Dict[str, Any]], Dict[str, str]]: List of serialized selected topics or error dictionary.
    """
    try:
        print(f"Calling get_selection(mode={mode}, turbo_mode={turbo_mode})", file=sys.stderr)
        selection = _get_selection(mode=mode, turbo_mode=turbo_mode)
        print("get_selection successful, returning serialized selection.", file=sys.stderr)
        return _serialize_result(selection)
    except Exception as e:
        return _handle_mindmanager_error("get_selection", e)

@mcp.tool()
async def get_library_folder() -> Union[str, Dict[str, str]]:
    """
    Gets the path to the MindManager library folder.

    Returns:
        Union[str, Dict[str, str]]: The library folder path or error dictionary.
    """
    try:
        folder_path = _get_library_folder()
        print(f"get_library_folder() returned: {folder_path}", file=sys.stderr)
        return folder_path
    except Exception as e:
        return _handle_mindmanager_error("get_library_folder", e)

@mcp.tool()
async def get_mindmanager_version() -> Union[str, Dict[str, str]]:
    """
    Gets the version of the MindManager application.

    Returns:
        Union[str, Dict[str, str]]: The version of the MindManager application or error dictionary.
    """
    try:
        version = mm.Mindmanager().get_version()
        print(f"get_mindmanager_version() returned: {version}", file=sys.stderr)
        return version
    except Exception as e:
        return _handle_mindmanager_error("get_mindmanager_version", e)

@mcp.tool()
async def get_grounding_information(
    mode: str = 'full',
    turbo_mode: bool = False
) -> Union[List[str], Dict[str, str]]:
    """
    Extracts grounding information (central topic, selected subtopics) from the mindmap.

    Args:
        mode (str): Detail level ('full', 'content', 'text'). Defaults to 'full'.
        turbo_mode (bool): Enable turbo mode (text only). Defaults to False.

    Returns:
        Union[List[str], Dict[str, str]]: A list containing [top_most_topic, subtopics_string] or error dictionary.
    """
    try:
        print("Calling get_grounding_information()", file=sys.stderr)
        top_most, subtopics_str = _get_grounding_information(mode=mode, turbo_mode=turbo_mode)
        print(f"get_grounding_information() returned: top='{top_most}', subtopics='{subtopics_str}'", file=sys.stderr)
        return [top_most, subtopics_str]  # Return as list for JSON
    except Exception as e:
        # This function doesn't directly call MindManager, so errors are less likely external
        print(f"ERROR in get_grounding_information: {e}", file=sys.stderr)
        return {"error": "Internal Error", "message": f"Failed to get grounding information: {e}"}
# == Serialization Methods (Operating on current in-memory mindmap) ==
@mcp.tool()
async def serialize_current_mindmap_to_mermaid(
    id_only: bool = False,
    mode: str = 'full',
    turbo_mode: bool = False
) -> Union[str, Dict[str, str]]:
    """
    Serializes the currently loaded mindmap to Mermaid format.

    Args:
        id_only (bool): If True, only include IDs without detailed attributes. Defaults to False.
        mode (str): Detail level ('full', 'content', 'text'). Defaults to 'full'.
        turbo_mode (bool): Enable turbo mode (text only). Defaults to False.

    Returns:
        Union[str, Dict[str, str]]: Mermaid formatted string or error dictionary.
    """
    try:
        print(f"Serializing current mindmap to Mermaid (id_only={id_only}).", file=sys.stderr)
        text = _serialize_mermaid(id_only=id_only, mode=mode, turbo_mode=turbo_mode)
        print("Serialization to Mermaid successful.", file=sys.stderr)
        return text
    except Exception as e:
        print(f"ERROR during serialization to Mermaid: {e}", file=sys.stderr)
        return {"error": "Serialization Error", "message": f"Failed to serialize to Mermaid: {e}"}

@mcp.tool()
async def serialize_current_mindmap_to_markdown(
    include_notes: bool = True,
    mode: str = 'content',
    turbo_mode: bool = False
) -> Union[str, Dict[str, str]]:
    """
    Serializes the currently loaded mindmap to Markdown format.

    Args:
        include_notes (bool): If True, include notes in the serialization. Defaults to True.
        mode (str): Detail level ('full', 'content', 'text'). Defaults to 'content'.
        turbo_mode (bool): Enable turbo mode (text only). Defaults to False.

    Returns:
        Union[str, Dict[str, str]]: Markdown formatted string or error dictionary.
    """
    try:
        print("Serializing current mindmap to Markdown.", file=sys.stderr)
        text = _serialize_markdown(include_notes=include_notes, mode=mode, turbo_mode=turbo_mode)
        print("Serialization to Markdown successful.", file=sys.stderr)
        return text
    except Exception as e:
        print(f"ERROR during serialization to Markdown: {e}", file=sys.stderr)
        return {"error": "Serialization Error", "message": f"Failed to serialize to Markdown: {e}"}

@mcp.tool()
async def serialize_current_mindmap_to_json(
    ignore_rtf: bool = True,
    mode: str = 'full',
    turbo_mode: bool = True
) -> Union[Dict[str, Any], Dict[str, str]]:
    """
    Serializes the currently loaded mindmap to a detailed JSON object with ID mapping.

    Args:
        ignore_rtf (bool): Whether to ignore RTF content. Defaults to True.
        mode (str): Detail level ('full', 'content', 'text'). Defaults to 'full'.
        turbo_mode (bool): Enable turbo mode (text only). Defaults to True.

    Returns:
        Union[Dict[str, Any], Dict[str, str]]: JSON serializable dictionary or error dictionary.
    """
    try:
        print(f"Serializing current mindmap to detailed JSON (ignore_rtf={ignore_rtf}).", file=sys.stderr)
        json_obj = _serialize_json(ignore_rtf=ignore_rtf, mode=mode, turbo_mode=turbo_mode)
        print("Serialization to detailed JSON successful.", file=sys.stderr)
        return json_obj
    except Exception as e:
        print(f"ERROR during serialization to JSON: {e}", file=sys.stderr)
        return {"error": "Serialization Error", "message": f"Failed to serialize to JSON: {e}"}
# == Deserialization Methods (Applying Mermaid to MindManager) ==
@mcp.tool()
async def create_mindmap_from_mermaid(
    mermaid: str
) -> Dict[str, str]:
    """
    Deserializes a Mermaid mindmap and creates a MindManager mindmap from it (the caller must follow the guidance below).

    Args:
        mermaid (str): Mermaid text describing the desired mindmap with supported topic metadata, e.g. `[Topic] %% {"id": n, "notes": {"text": "Notes"}, "links": [{"text": "label", "url": "https://example.com"}], "references": [{"id_1": i, "id_2": j, "direction": 1}], "image": {"text": "C:\\path\\to\\image.png"}, "icons": [{"text": "StockIcon-36", "is_stock_icon": true, "index": 36}], "tags": ["tag1"]}`

    Guidance for callers constructing `mermaid`:
    - Every line must be syntactically correct Mermaid code and contain at least a topic label, e.g. `[Topic]`.
    - For the root topic just use the label, e.g. `[Central Topic]`.
    - Full syntax supports attaching metadata via JSON after `%%` on the same line, e.g.
      `[Topic] %% {"id": n, "notes": {"text": "Notes"}, "links": [{"text": "label", "url": "https://example.com"}], "references": [{"id_1": i, "id_2": j, "direction": 1}], "image": {"text": "C:\\path\\to\\image.png"}, "icons": [{"text": "StockIcon-36", "is_stock_icon": true, "index": m}], "tags": ["tag1"]}`
    - For icons, use `icons`: `[{"text": "StockIcon-<index>", "is_stock_icon": true, "index": <index>}]` where available options for stock icons are: Arrow Down(66), Arrow Left(65), Arrow Right(37), Arrow Up(36), Bomb(51), Book(67), Broken Connection(69), Calendar(8), Camera(41), Cellphone(40), Check(62), Clock(7), Coffee Cup(59), Dollar(15), Email(10), Emergency(49), Euro(16), Exclamation Mark(44), Fax(42), Flag Black(20), Flag Blue(18), Flag Green(19), Flag Orange(21), Flag Purple(23), Flag Red(17), Flag Yellow(22), Folder(71), Glasses(53), Hourglass(48), House(13), Information(70), Judge Hammer(54), Key(52), Letter(9), Lightbulb(58), Magnifying Glass(68), Mailbox(11), Marker 1(25), Marker 2(26), Marker 3(27), Marker 4(28), Marker 5(29), Marker 6(30), Marker 7(31), Meeting(61), Megaphone(12), No Entry(50), Note(63), On Hold(47), Padlock Locked(34), Padlock Unlocked(35), Phone(39), Question Mark(45), Redo(57), Resource 1(32), Resource 2(33), Rocket(55), Rolodex(14), Scales(56), Smiley Angry(5), Smiley Happy(2), Smiley Neutral(3), Smiley Sad(4), Smiley Screaming(6), Stop(43), Thumbs Down(64), Thumbs Up(46), Traffic Lights Red(24), Two End Arrow(38), Two Feet(60).

    Returns:
        Dict[str, str]: Status dictionary indicating success or error details.
    """
    if not mermaid or not mermaid.strip():
        return {"error": "Invalid Input", "message": "Mermaid content is required."}
    try:
        print("Creating mindmap from Mermaid diagram (full).", file=sys.stderr)
        _deserialize_mermaid(mermaid=mermaid, turbo_mode=False)
        print("Mindmap created from Mermaid diagram.", file=sys.stderr)
        ret_val = {"status": "success", "message": "Mindmap created from Mermaid diagram."}
        return ret_val
    except Exception as e:
        return _handle_mindmanager_error("create_mindmap_from_mermaid", e)

@mcp.tool()
async def create_mindmap_from_mermaid_simple(
    mermaid: str,
    turbo_mode: bool = True
) -> Dict[str, str]:
    """
    Deserializes a Mermaid mindmap in simplified syntax and creates a MindManager mindmap from it.

    Args:
        mermaid (str): Mermaid text describing the desired mindmap.
        turbo_mode (bool): Enable turbo mode (text-only operations). Defaults to True.

    Returns:
        Dict[str, str]: Status dictionary indicating success or error details.
    """
    if not mermaid or not mermaid.strip():
        return {"error": "Invalid Input", "message": "Mermaid content is required."}
    try:
        print("Creating mindmap from Mermaid diagram (simple).", file=sys.stderr)
        _deserialize_mermaid_simple(mermaid=mermaid, turbo_mode=turbo_mode)
        print("Mindmap created from Mermaid diagram (simple).", file=sys.stderr)
        return {"status": "success", "message": "Mindmap created from Mermaid diagram (simple)."}
    except Exception as e:
        return _handle_mindmanager_error("create_mindmap_from_mermaid_simple", e)

@mcp.tool()
async def get_versions() -> Dict[str, str]:
    """
    Get the versions of the MindManager Automation MCP Server components.

    Returns:
        Dict[str, str]: A dictionary containing the versions of the components.
    """
    result = {}
    result["mindm-mcp"] = __version__
    result["mindm"] = mindm.__version__
    return result

def main():
    print("Starting MindManager Automation MCP Server...", file=sys.stderr)
    try:
        mcp.run(transport='stdio')
    except Exception as main_e:
        print(f"FATAL: Server crashed: {main_e}", file=sys.stderr)
        sys.exit(1)
    finally:
        print("MindManager Automation MCP Server stopped.", file=sys.stderr)

# --- Main Execution ---
if __name__ == "__main__":
    main()