FEGIS

FEGIS does three things:

  1. Easy-to-write tools - Define tools as prompts in YAML. Tool schemas use flexible natural-language instructions.
  2. Structured data from tool calls saved in a vector database - Every tool use is automatically stored in Qdrant with full context.
  3. Search - The AI can search all previous tool usage by semantic similarity, filters, or direct lookup.

Quick Start

```shell
# Install uv
# Windows
winget install --id=astral-sh.uv -e
# macOS/Linux
curl -LsSf https://astral.sh/uv/install.sh | sh

# Clone
git clone https://github.com/p-funk/fegis.git

# Start Qdrant
docker run -d --name qdrant -p 6333:6333 -p 6334:6334 qdrant/qdrant:latest
```

Configure Claude Desktop

Update claude_desktop_config.json:

```json
{
  "mcpServers": {
    "fegis": {
      "command": "uv",
      "args": [
        "--directory",
        "/absolute/path/to/fegis",
        "run",
        "fegis"
      ],
      "env": {
        "QDRANT_URL": "http://localhost:6333",
        "QDRANT_API_KEY": "",
        "COLLECTION_NAME": "fegis_memory",
        "EMBEDDING_MODEL": "BAAI/bge-small-en",
        "ARCHETYPE_PATH": "/absolute/path/to/fegis/archetypes/default.yaml",
        "AGENT_ID": "claude_desktop"
      }
    }
  }
}
```

Restart Claude Desktop. You'll have seven new tools available, including SearchMemory.

How It Works

1. Tools from YAML

```yaml
parameters:
  BiasScope:
    description: "Range of bias detection to apply"
    examples: [confirmation, availability, anchoring, systematic, comprehensive]
  IntrospectionDepth:
    description: "How deeply to examine internal reasoning processes"
    examples: [surface, moderate, deep, exhaustive, meta_recursive]

tools:
  BiasDetector:
    description: "Identify reasoning blind spots, cognitive biases, and systematic errors in AI thinking patterns through structured self-examination"
    parameters:
      BiasScope:
      IntrospectionDepth:
    frames:
      identified_biases:
        type: List
        required: true
      reasoning_patterns:
        type: List
        required: true
      alternative_perspectives:
        type: List
        required: true
```
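An archetype like this can be loaded with any standard YAML parser. As a minimal sketch (using PyYAML and a trimmed copy of the archetype above; the traversal logic is illustrative, not FEGIS's actual loader), here is how a client might enumerate the declared tools, their parameters, and their required frames:

```python
import yaml

# A trimmed copy of the archetype above (illustrative only).
ARCHETYPE = """
parameters:
  BiasScope:
    description: "Range of bias detection to apply"
    examples: [confirmation, availability, anchoring]
tools:
  BiasDetector:
    description: "Identify cognitive biases through structured self-examination"
    parameters:
      BiasScope:
    frames:
      identified_biases: {type: List, required: true}
"""

spec = yaml.safe_load(ARCHETYPE)

# List each declared tool with its parameter names and required frames.
for name, tool in spec["tools"].items():
    params = list(tool.get("parameters") or {})
    required_frames = [frame for frame, meta in (tool.get("frames") or {}).items()
                       if (meta or {}).get("required")]
    print(name, params, required_frames)
```

Running this prints one line per tool, showing that the natural-language schema still carries enough structure for a program to discover tools and validate their outputs.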

2. Automatic Memory Storage

Every tool invocation gets stored with:

  • Tool name and parameters used
  • Complete input and output
  • Timestamp and session context
  • Vector embeddings for semantic search
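Concretely, one stored invocation can be pictured as a Qdrant point: a payload carrying the call context plus a dense vector for search. The field names below are assumptions for the sketch, not FEGIS's actual payload schema:

```python
import uuid
from datetime import datetime, timezone

# Illustrative shape of one stored tool invocation (field names are
# assumptions, not FEGIS's actual storage schema).
memory_point = {
    "id": str(uuid.uuid4()),
    "payload": {
        "tool": "BiasDetector",
        "parameters": {"BiasScope": "confirmation", "IntrospectionDepth": "deep"},
        "input": "Review my reasoning about the launch decision.",
        "output": {"identified_biases": ["confirmation"], "reasoning_patterns": []},
        "timestamp": datetime.now(timezone.utc).isoformat(),
        "agent_id": "claude_desktop",
    },
    # The dense vector would come from the configured embedding model;
    # BAAI/bge-small-en produces 384-dimensional embeddings. Zeros here
    # stand in for a real embedding.
    "vector": [0.0] * 384,
}
print(memory_point["payload"]["tool"])
```

Because every call is stored in this shape, later searches can combine vector similarity over the embedding with payload filters (tool name, agent, time range).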

3. SearchMemory Tool

  • "Use SearchMemory and find my analysis of privacy concerns"
  • "Use SearchMemory and what creative ideas did I generate last week?"
  • "Use SearchMemory and show me all UncertaintyNavigator results"
  • "Use SearchMemory and search for memories about decision-making"

Available Archetypes

  • archetypes/default.yaml - Cognitive analysis tools (UncertaintyNavigator, BiasDetector, etc.)
  • archetypes/simple_example.yaml - Basic example tools
  • archetypes/emoji_mind.yaml - Symbolic reasoning with emojis
  • archetypes/slime_mold.yaml - Network optimization tools
  • archetypes/vibe_surfer.yaml - Web exploration tools

Configuration

Required environment variables:

  • ARCHETYPE_PATH - Path to YAML archetype file
  • QDRANT_URL - Qdrant database URL (default: http://localhost:6333)

Optional environment variables:

  • COLLECTION_NAME - Qdrant collection name (default: fegis_memory)
  • AGENT_ID - Identifier for this agent (default: default-agent)
  • EMBEDDING_MODEL - Dense embedding model (default: BAAI/bge-small-en)
  • QDRANT_API_KEY - API key for remote Qdrant (default: empty)
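The defaults listed above can be mirrored in a small settings helper. This is a sketch, not FEGIS's actual configuration code; only the variable names and defaults come from the documentation:

```python
import os

def load_config(env=os.environ):
    """Read FEGIS settings, applying the documented defaults."""
    archetype_path = env.get("ARCHETYPE_PATH")  # required, no default
    if not archetype_path:
        raise RuntimeError("ARCHETYPE_PATH must be set")
    return {
        "archetype_path": archetype_path,
        "qdrant_url": env.get("QDRANT_URL", "http://localhost:6333"),
        "collection_name": env.get("COLLECTION_NAME", "fegis_memory"),
        "agent_id": env.get("AGENT_ID", "default-agent"),
        "embedding_model": env.get("EMBEDDING_MODEL", "BAAI/bge-small-en"),
        "qdrant_api_key": env.get("QDRANT_API_KEY", ""),
    }

# With only the required variable set, everything else falls back to defaults.
cfg = load_config({"ARCHETYPE_PATH": "/tmp/default.yaml"})
print(cfg["qdrant_url"])
```

Passing the environment as a parameter keeps the helper testable without mutating the real process environment.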

Requirements

  • Python 3.13+
  • uv package manager
  • Docker (for Qdrant)
  • MCP-compatible client

License

MIT License - see LICENSE file for details.

FEGIS is a Model Context Protocol server that gives LLMs structured, persistent, and portable memory through customizable cognitive tools defined in schemas.