Uses Docker to run the Qdrant vector database, which stores the persistent memories created by the model.
Uses YAML configuration files to define custom cognitive frameworks and memory structures for the model.
Click on "Install Server".
Wait a few minutes for the server to deploy. Once ready, it will show a "Started" state.
In the chat, type @ followed by the MCP server name and your instructions, e.g., "@FEGIS search for my previous bias analyses on confirmation bias".
That's it! The server will respond to your query, and you can continue using it as needed.
Fegis
Fegis does three things:
Easy-to-write tools - Define tools as prompts in YAML; tool schemas use flexible natural-language instructions.
Structured data from tool calls saved in a vector database - Every tool use is automatically stored in Qdrant with full context.
Search - The AI can search all previous tool usage using semantic similarity, filters, or direct lookup.
Quick Start
Configure Claude Desktop
Update claude_desktop_config.json:
Restart Claude Desktop. You'll have 7 new tools available including SearchMemory.
How It Works
1. Tools from YAML
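Instead of rigid JSON Schema, each tool is described in plain language in the archetype file. The snippet below is a hypothetical sketch, not the actual Fegis schema; see archetypes/default.yaml for real definitions:

```yaml
# Hypothetical archetype sketch -- field names are illustrative,
# not the real Fegis schema; see archetypes/default.yaml.
tools:
  BiasDetector:
    description: >
      Review the reasoning so far and name any cognitive biases at play,
      noting how each one might be distorting the conclusion.
    fields:
      observed_bias: The bias you detected, in plain language.
      evidence: The specific statements that suggest it.
      correction: How the reasoning should be adjusted.
```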
2. Automatic Memory Storage
Every tool invocation gets stored with the following (sketched below):
Tool name and parameters used
Complete input and output
Timestamp and session context
Vector embeddings for semantic search
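Conceptually, a stored record looks something like this; the field names are illustrative and the actual Qdrant payload layout may differ:

```json
{
  "tool": "BiasDetector",
  "parameters": { "observed_bias": "confirmation bias" },
  "input": "…full prompt the tool was called with…",
  "output": "…full text the tool returned…",
  "timestamp": "2025-01-01T12:00:00Z",
  "session": "claude-desktop-session-id",
  "agent_id": "default-agent",
  "embedding": "dense vector from BAAI/bge-small-en"
}
```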
3. SearchMemory Tool
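SearchMemory queries everything stored above by semantic similarity, with optional filters or direct lookup. In practice you can simply ask in chat, as in the example near the top of this page:

```
@FEGIS use SearchMemory to find my earlier BiasDetector results about confirmation bias
```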
Available Archetypes
archetypes/default.yaml - Cognitive analysis tools (UncertaintyNavigator, BiasDetector, etc.)
archetypes/simple_example.yaml - Basic example tools
archetypes/emoji_mind.yaml - Symbolic reasoning with emojis
archetypes/slime_mold.yaml - Network optimization tools
archetypes/vibe_surfer.yaml - Web exploration tools
Configuration
Required environment variables:
ARCHETYPE_PATH - Path to YAML archetype file
QDRANT_URL - Qdrant database URL (default: http://localhost:6333)
Optional environment variables:
COLLECTION_NAME - Qdrant collection name (default: fegis_memory)
AGENT_ID - Identifier for this agent (default: default-agent)
EMBEDDING_MODEL - Dense embedding model (default: BAAI/bge-small-en)
QDRANT_API_KEY - API key for remote Qdrant (default: empty)
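When running the server outside Claude Desktop (for example while developing an archetype), the same variables can be exported in the shell; the values below are simply the documented defaults:

```sh
export ARCHETYPE_PATH=archetypes/default.yaml   # required
export QDRANT_URL=http://localhost:6333         # default local Qdrant
export COLLECTION_NAME=fegis_memory             # optional, documented default
export AGENT_ID=default-agent
export EMBEDDING_MODEL=BAAI/bge-small-en
export QDRANT_API_KEY=                          # only needed for remote Qdrant
```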
Requirements
Python 3.13+
uv package manager
Docker (for Qdrant)
MCP-compatible client
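A common way to start Qdrant locally on the default port expected by QDRANT_URL:

```sh
# Start Qdrant on port 6333, with a named volume so memories persist across restarts
docker run -d -p 6333:6333 -v qdrant_storage:/qdrant/storage qdrant/qdrant
```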
License
MIT License - see LICENSE file for details.