Congo River Compositional Intelligence MCP Server

Status: Phase 1 in Progress (Foundation Complete!)

A production-grade MCP (Model Context Protocol) server that embodies compositional intelligence principles, providing tools for semantic decomposition, proof search, knowledge graphs, and neuro-symbolic reasoning.

Key integrations:
OpenAI's API for hybrid reasoning operations that blend neural language processing with symbolic proof search and knowledge graph querying
spaCy for natural language processing in semantic decomposition and neuro-symbolic query parsing
Supabase PostgreSQL with pgvector as the database backend for semantic knowledge storage: RDF triples, lambda abstractions, proof trees, vector embeddings, and compositional patterns
The Congo River Philosophy
This project implements "Congo River Compositional Intelligence" - the idea that powerful understanding emerges from thousands of tributaries (simple reasoning operations) composing into one massive flow (deep intelligence). Key principles:
Compositional Structure: Complex reasoning built from simple, composable operations
Polyglot Architecture: Each component implemented in its optimal language
Semantic Foundations: Grounded in RDF triples, lambda calculus, and proof theory
Neuro-Symbolic Integration: Bridges neural (LLMs) and symbolic (knowledge graphs) AI
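The first principle — complex reasoning built from simple, composable operations — can be illustrated with a toy sketch (the function names here are hypothetical, not part of the server):

```python
from functools import reduce

def compose(*fns):
    """Right-to-left composition: compose(f, g)(x) == f(g(x))."""
    return reduce(lambda f, g: lambda x: f(g(x)), fns)

# Three "tributaries": each operation is trivial on its own.
normalize = str.lower
tokenize = str.split
count_tokens = len

# The composed pipeline is the "river": one flow built from small parts.
analyze = compose(count_tokens, tokenize, normalize)
analyze("The River Flows To The Sea")  # -> 6
```

Each tributary stays independently testable; intelligence comes from how they compose, not from any single step.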
Quick Start
Prerequisites
Node.js 18+
Python 3.10+
Supabase account (or local PostgreSQL with pgvector)
Anthropic and/or OpenAI API keys
Installation
Configuration
Edit .env with your settings:
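A minimal sketch of what the settings might look like; the variable names below are assumptions, so take the authoritative names from the project's own `.env` template:

```shell
# Illustrative values only -- variable names are assumed, not confirmed.
SUPABASE_URL=https://your-project.supabase.co
SUPABASE_SERVICE_KEY=your-service-key
ANTHROPIC_API_KEY=sk-ant-...
OPENAI_API_KEY=sk-...
```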
Add to Claude Code
Add to your .mcp.json:
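For example (the server name and entry point below are illustrative assumptions, not confirmed paths):

```json
{
  "mcpServers": {
    "congo-river": {
      "command": "node",
      "args": ["path/to/dist/index.js"]
    }
  }
}
```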
Available Tools
Core Reasoning Tools
1. Decomposes concepts into RDF subject-predicate-object triples
Implements Stanley Fish's 3-word sentence principle
Stores in knowledge graph for later querying
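A minimal sketch of the idea behind this tool, assuming a naive whitespace split rather than the server's actual spaCy-based parsing:

```python
from typing import NamedTuple

class Triple(NamedTuple):
    subject: str
    predicate: str
    obj: str

def decompose(sentence: str) -> Triple:
    """Naive S-P-O split in the spirit of Fish's 3-word sentences."""
    subject, predicate, obj = sentence.split(maxsplit=2)
    return Triple(subject.lower(), predicate.lower(), obj.lower())

decompose("Rivers carve valleys")
# -> Triple(subject='rivers', predicate='carve', obj='valleys')
```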
2. Converts processes/code into lambda calculus
Shows compositional structure with type signatures
Applies beta reduction for simplification
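The beta-reduction step can be sketched with a tiny term representation (capture-avoiding renaming is omitted for brevity, and the server's real lambda service is in TypeScript, so this is only a Python illustration):

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Var:
    name: str

@dataclass(frozen=True)
class Lam:
    param: str
    body: object

@dataclass(frozen=True)
class App:
    fn: object
    arg: object

def substitute(term, name, value):
    """Capture-naive substitution: replace free occurrences of `name`."""
    if isinstance(term, Var):
        return value if term.name == name else term
    if isinstance(term, Lam):
        if term.param == name:  # `name` is shadowed inside this lambda
            return term
        return Lam(term.param, substitute(term.body, name, value))
    return App(substitute(term.fn, name, value), substitute(term.arg, name, value))

def beta_reduce(term):
    """Single-step beta reduction: (λx. body) arg  =>  body[x := arg]."""
    if isinstance(term, App) and isinstance(term.fn, Lam):
        return substitute(term.fn.body, term.fn.param, term.arg)
    return term

# (λx. x) y  reduces to  y
beta_reduce(App(Lam("x", Var("x")), Var("y")))  # -> Var('y')
```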
3. Searches for proofs given goals and premises
Multiple strategies: forward/backward chaining, resolution
Returns proof trees (Curry-Howard correspondence)
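Forward chaining, one of the listed strategies, can be sketched as a fixpoint loop (a real prover would also record the proof tree; the rules here are invented for illustration):

```python
def forward_chain(facts, rules):
    """Derive new facts until a fixpoint is reached.

    `rules` is a list of (premises, conclusion) pairs, where `premises`
    is a frozenset of facts that must all hold.
    """
    facts = set(facts)
    changed = True
    while changed:
        changed = False
        for premises, conclusion in rules:
            if premises <= facts and conclusion not in facts:
                facts.add(conclusion)
                changed = True
    return facts

rules = [
    (frozenset({"rain"}), "wet_ground"),
    (frozenset({"wet_ground"}), "slippery"),
]
forward_chain({"rain"}, rules)  # -> {'rain', 'wet_ground', 'slippery'}
```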
4. Queries the knowledge graph with SPARQL-like patterns
Natural language or structured queries
Returns matching triples and relationships
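Triple-pattern matching of this kind can be sketched in a few lines, with `None` standing in for a SPARQL variable (the example graph is invented):

```python
def match(triples, pattern):
    """SPARQL-style pattern match: None acts as a wildcard variable."""
    s, p, o = pattern
    return [t for t in triples
            if (s is None or t[0] == s)
            and (p is None or t[1] == p)
            and (o is None or t[2] == o)]

graph = [
    ("congo", "flows_into", "atlantic"),
    ("congo", "located_in", "africa"),
    ("amazon", "flows_into", "atlantic"),
]
match(graph, (None, "flows_into", "atlantic"))
# -> [('congo', 'flows_into', 'atlantic'), ('amazon', 'flows_into', 'atlantic')]
```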
5. Showcase feature: hybrid reasoning combining an LLM with the knowledge graph
Parses natural language into logical form
Queries graph symbolically
Synthesizes grounded answers with proof traces
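The hybrid loop can be sketched end to end, with a stub standing in for the LLM parsing step (all names and data here are hypothetical):

```python
def neuro_symbolic_answer(question, graph, parse_fn):
    """Hybrid sketch: neural parse -> symbolic graph query -> grounded answer."""
    pattern = parse_fn(question)  # in the real server, an LLM produces this
    matches = [t for t in graph
               if all(v is None or v == x for v, x in zip(pattern, t))]
    # The matched triples double as a proof trace grounding the answer.
    return {"answer": sorted(t[0] for t in matches), "evidence": matches}

graph = [
    ("congo", "flows_into", "atlantic"),
    ("nile", "flows_into", "mediterranean"),
]
stub_parse = lambda q: (None, "flows_into", "atlantic")  # stands in for the LLM
neuro_symbolic_answer("What flows into the Atlantic?", graph, stub_parse)
# -> {'answer': ['congo'], 'evidence': [('congo', 'flows_into', 'atlantic')]}
```

The key property is that the final answer cites concrete triples, so it is verifiable against the graph rather than free-floating LLM text.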
Meta Tools
6. Analyzes requirements and recommends the optimal programming language
Shows scoring rationale and trade-offs
Demonstrates meta-level compositional intelligence
7. Database management: status, health, migrations, stats
Switches between local/cloud configurations
8. Exports the knowledge graph to RDF or JSON
Backup and portability
9. Imports triples into the knowledge graph
Bulk loading from external sources
10. Comprehensive system health check
Database stats, service status, tool inventory
Architecture
Database Schema
The PostgreSQL schema includes:
`triples` - RDF knowledge graph storage
`proofs` - Proof trees and inference traces
`reasoning_sessions` - Tool invocation history
`embeddings` - Vector embeddings (pgvector)
`patterns` - Learned compositional patterns
`lambda_abstractions` - Lambda calculus representations
`concept_nodes` & `concept_edges` - Meta-level concept graph
Language Selection System
The server includes an automatic language recommendation engine that scores programming languages based on task requirements:
Supported Languages: TypeScript, Python, Prolog, Rust, Go
Scoring Dimensions:
Logic programming capabilities
Graph/RDF operations
Type system strength
Performance characteristics
ML/AI ecosystem
Semantic web support
Concurrency model
Web integration
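A sketch of how such a scoring engine might work; the dimension weights and language profiles below are invented for illustration and do not reflect the server's actual scores:

```python
# Hypothetical per-language scores (0-10) on a subset of the dimensions.
PROFILES = {
    "prolog":     {"logic": 10, "graph": 7, "types": 3,  "performance": 4,  "ml": 2},
    "python":     {"logic": 4,  "graph": 7, "types": 5,  "performance": 5,  "ml": 10},
    "typescript": {"logic": 3,  "graph": 6, "types": 8,  "performance": 6,  "ml": 5},
    "rust":       {"logic": 2,  "graph": 5, "types": 10, "performance": 10, "ml": 4},
}

def recommend(requirements):
    """Pick the language whose profile best covers the weighted requirements."""
    def score(profile):
        return sum(weight * profile.get(dim, 0)
                   for dim, weight in requirements.items())
    return max(PROFILES, key=lambda lang: score(PROFILES[lang]))

# A logic- and graph-heavy task favors Prolog under these invented weights.
recommend({"logic": 3, "graph": 2})  # -> 'prolog'
```

Exposing the per-dimension scores alongside the winner is what lets the tool show its "scoring rationale and trade-offs" rather than a bare recommendation.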
Conceptual Foundation
This system is grounded in deep theoretical connections:
J.D. Atlas - Semantic generality and presupposition
Richard Montague - Compositional semantics and type theory
Curry-Howard - Proofs as programs isomorphism
Tim Berners-Lee - RDF and semantic web
Modern LLMs - Neural learning of compositional structure
See: /home/mdz-axolotl/Documents/congo-river-compositional-intelligence.md for the complete theoretical framework.
Roadmap
Phase 1 (Current)
Project structure and configuration
Database schema (PostgreSQL + pgvector)
Database manager (local/cloud support)
Language selection scoring system
Main MCP server with 10 tools
Python services implementation
TypeScript lambda service
Neuro-symbolic integration
End-to-end testing
Phase 2: Enhanced Reasoning
Tree of Thoughts orchestrator
Chain of Thought tracer
Phase 3: Meta-Cognitive Layer
Compositional analyzer (multi-lens analysis)
Loop discovery engine
Phase 4-7: Learning, Production, Knowledge Management, Advanced Neuro-Symbolic
(See full roadmap in /home/mdz-axolotl/.claude/plans/serialized-meandering-starlight.md)
Development
Example Usage
Contributing
This is a research/educational project exploring compositional intelligence. Contributions welcome!
License
MIT
**The Congo River flows with unstoppable force from thousands of tributaries composing into one.**