Fegis
FEGIS is a Model Context Protocol server that gives LLMs structured, persistent, and portable memory through customizable cognitive tools defined in schema.
Fegis does three things:
- Easy-to-write tools - Define tools as prompts in YAML. Tool schemas use flexible natural-language instructions.
- Structured data from tool calls saved in a vector database - Every tool use is automatically stored in Qdrant with full context.
- Search - The AI can search all previous tool usage by semantic similarity, filters, or direct lookup.
Quick Start
Configure Claude Desktop
Update claude_desktop_config.json:
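A minimal entry might look like the following sketch. The `command`, `args`, and paths here are illustrative assumptions, not the project's documented values; the environment variable names match the Configuration section below:

```json
{
  "mcpServers": {
    "fegis": {
      "command": "uv",
      "args": ["run", "fegis"],
      "env": {
        "ARCHETYPE_PATH": "/path/to/archetypes/default.yaml",
        "QDRANT_URL": "http://localhost:6333"
      }
    }
  }
}
```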
Restart Claude Desktop. You'll have seven new tools available, including SearchMemory.
How It Works
1. Tools from YAML
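As a sketch of the idea, an archetype file describes each tool in natural language and Fegis turns it into a callable tool schema. The structure and field names below are hypothetical, not the project's actual schema:

```yaml
# Hypothetical archetype sketch -- keys and structure are illustrative,
# not Fegis's documented format.
tools:
  ReflectionJournal:
    description: Record a structured reflection on the current topic.
    fields:
      insight: Capture the single most important realization.
      confidence: Rate how certain you are, from low to high.
```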
2. Automatic Memory Storage
Every tool invocation gets stored with:
- Tool name and parameters used
- Complete input and output
- Timestamp and session context
- Vector embeddings for semantic search
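Conceptually, a stored record mirrors the items above. The field names in this sketch are illustrative, not the actual storage schema:

```json
{
  "tool": "ReflectionJournal",
  "parameters": {"insight": "Caching was the bottleneck"},
  "input": "...",
  "output": "...",
  "timestamp": "2025-01-01T12:00:00Z",
  "session": "default-agent",
  "embedding": [0.12, -0.03, 0.41]
}
```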
3. SearchMemory Tool
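SearchMemory supports semantic similarity, filters, and direct lookup, so a call might combine a free-text query with a filter. The parameter names in this sketch are assumptions, not the tool's documented interface:

```json
{
  "query": "what did I conclude about caching strategies?",
  "filter": {"tool": "ReflectionJournal"},
  "limit": 5
}
```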
Available Archetypes
- archetypes/default.yaml - Cognitive analysis tools (UncertaintyNavigator, BiasDetector, etc.)
- archetypes/simple_example.yaml - Basic example tools
- archetypes/emoji_mind.yaml - Symbolic reasoning with emojis
- archetypes/slime_mold.yaml - Network optimization tools
- archetypes/vibe_surfer.yaml - Web exploration tools
Configuration
Required environment variables:
- ARCHETYPE_PATH - Path to YAML archetype file
- QDRANT_URL - Qdrant database URL (default: http://localhost:6333)
Optional environment variables:
- COLLECTION_NAME - Qdrant collection name (default: fegis_memory)
- AGENT_ID - Identifier for this agent (default: default-agent)
- EMBEDDING_MODEL - Dense embedding model (default: BAAI/bge-small-en)
- QDRANT_API_KEY - API key for remote Qdrant (default: empty)
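For a local setup, the variables above might be set like this. The values shown are the documented defaults; the archetype path is an example:

```shell
# Point Fegis at an archetype file and a local Qdrant instance.
export ARCHETYPE_PATH="archetypes/default.yaml"
export QDRANT_URL="http://localhost:6333"
# Optional overrides (these are the documented defaults):
export COLLECTION_NAME="fegis_memory"
export AGENT_ID="default-agent"
```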
Requirements
- Python 3.13+
- uv package manager
- Docker (for Qdrant)
- MCP-compatible client
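Qdrant publishes an official Docker image, so a local instance can be started before launching the server; port 6333 matches the default QDRANT_URL above. The volume name is an example:

```shell
# Start a local Qdrant instance on the default port.
# The named volume keeps stored memories across container restarts.
docker run -d -p 6333:6333 -v qdrant_storage:/qdrant/storage qdrant/qdrant
```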
License
MIT License - see LICENSE file for details.