Enables deploying the MCP server code to GitHub repositories with setup scripts for initialization, remote configuration, and code pushing
Optional integration for setting up CI/CD workflows to test and build the server
Provides configuration instructions for integrating with Claude Desktop app on macOS systems
Supports running the MCP server using Node.js runtime environment
Integrates with Python to access the NSAF framework capabilities
Click on "Install Server".
Wait a few minutes for the server to deploy. Once ready, it will show a "Started" state.
In the chat, type
@followed by the MCP server name and your instructions, e.g., "@NSAF MCP Serverrun NSAF evolution for a predictive maintenance system with 95% accuracy target"
That's it! The server will respond to your query, and you can continue using it as needed.
Here is a step-by-step guide with screenshots.
Neuro-Symbolic Autonomy Framework (NSAF) v1.0
The Complete, Unified Implementation of Advanced AI Autonomy
Author: Bolorerdene Bundgaa
Contact: bolor@ariunbolor.org
Website: https://bolor.me
A comprehensive Python framework that combines quantum computing, symbolic reasoning, neural networks, and foundation models into a unified autonomous AI system.
π What's New in v1.0
This is the unified, production-ready version that combines:
β Complete 5-Module Architecture: All advanced NSAF components
β Foundation Model Integration: OpenAI, Anthropic, Google APIs
β MCP Protocol Support: AI assistant integration built-in
β Web API Framework: Production deployment ready
β Enterprise Features: Authentication, databases, monitoring
Related MCP server: CodeAlive MCP
ποΈ Architecture Overview
Core Modules
Quantum-Symbolic Task Clustering - Decompose complex problems using quantum-enhanced algorithms
Self-Constructing Meta-Agents (SCMA) - Evolve specialized AI agents automatically
Hyper-Symbolic Memory - RDF-based knowledge graphs with semantic reasoning
Recursive Intent Projection (RIP) - Multi-step planning and optimization
Human-AI Synergy - Cognitive state synchronization and collaboration
Integration Layers
Foundation Models - GPT-4, Claude, Gemini integration for embeddings and reasoning
MCP Interface - Model Context Protocol for AI assistant integration
Web APIs - FastAPI-based services with authentication
Distributed Computing - Ray-based scaling and quantum backends
π οΈ Installation
Prerequisites
Python 3.8+
8GB+ RAM recommended
GPU optional (for large models)
Quick Install
# Clone the repository
git clone https://github.com/ariunbolor/nsaf-mcp-server.git
cd nsaf-mcp-server
# Install all dependencies
pip install -r requirements.txt
# Run the unified example
python unified_example.pyDependencies Included
Quantum Computing: Qiskit, Cirq, PennyLane
Machine Learning: PyTorch, TensorFlow, Scikit-learn
Distributed: Ray, Redis
Web Framework: FastAPI, WebSockets
Databases: SQLAlchemy, PostgreSQL, Redis
Semantic Web: RDFlib, NetworkX
Foundation Models: OpenAI, Anthropic clients
π― Quick Start
Basic Usage
import asyncio
from core import NeuroSymbolicAutonomyFramework
async def main():
# Initialize the framework
framework = NeuroSymbolicAutonomyFramework()
# Define your task
task = {
'description': 'Build an AI system for predictive maintenance',
'goals': [
{'type': 'accuracy', 'target': 0.95, 'priority': 0.9},
{'type': 'latency', 'target': 50, 'priority': 0.8}
],
'constraints': [
{'type': 'memory', 'limit': '8GB', 'importance': 0.9}
]
}
# Process through NSAF pipeline
result = await framework.process_task(task)
print(f"Clusters: {len(result['task_clusters'])}")
print(f"Agents: {len(result['agents'])}")
await framework.shutdown()
asyncio.run(main())MCP Integration (AI Assistants)
from core import NSAFMCPServer
# Create MCP server for Claude/other AI assistants
server = NSAFMCPServer()
# Available tools:
# - run_nsaf_evolution
# - analyze_nsaf_memory
# - project_nsaf_intent
# - cluster_nsaf_tasks
# - get_nsaf_statusβοΈ Configuration
Environment Variables
# Foundation Models (Optional)
export OPENAI_API_KEY="your-openai-key"
export ANTHROPIC_API_KEY="your-anthropic-key"
export GOOGLE_API_KEY="your-google-key"
# Databases (Optional)
export DATABASE_PASSWORD="your-db-password"
export REDIS_PASSWORD="your-redis-password"
# Security (Production)
export JWT_SECRET="your-jwt-secret"
export API_KEY="your-api-key"Configuration File
All settings in config/config.yaml:
Foundation model providers and settings
Quantum backend configuration
Distributed computing setup
Database connections
Security and authentication
Feature flags and optimization
π§ͺ Examples
Run Complete Demo
python unified_example.pyShows all features working together with a complex predictive maintenance task.
Individual Components
python example.py # Original NSAF framework
python -m core.mcp_interface # MCP server for AI assistants π§ Advanced Features
Quantum Computing
IBM Qiskit integration for quantum optimization
Configurable quantum backends (simulator/real hardware)
Quantum-enhanced similarity computation
Foundation Models
Multi-provider support (OpenAI, Anthropic, Google)
Automatic fallbacks and error handling
Task-specific model selection
Distributed Processing
Ray-based distributed computing
Auto-scaling worker management
GPU/CPU resource optimization
Enterprise Ready
FastAPI web services
JWT authentication
PostgreSQL/Redis support
Monitoring and logging
Docker deployment ready
π Performance
Component | Performance | Scalability |
Task Clustering | 1000+ tasks/sec | Quantum-enhanced |
Agent Evolution | 100 agents/gen | Distributed training |
Memory Graph | 1M+ nodes | RDF triple store |
Intent Planning | 10 steps/sec | Recursive optimization |
API Response | <100ms | Auto-scaling |
π Security
β API Authentication: JWT tokens and API keys
β Data Encryption: AES-256 encryption at rest
β Secure Connections: HTTPS/WSS only in production
β Access Control: Role-based permissions
β Audit Logging: Comprehensive activity tracking
π§° Development
Testing
pytest tests/ # Run all tests
pytest tests/test_integration.py # Integration tests
pytest --cov=core tests/ # Coverage reportCode Quality
black core/ # Format code
isort core/ # Sort imports
mypy core/ # Type checking
flake8 core/ # LintingDocumentation
sphinx-build docs/ docs/_build/ # Generate docsπ Deployment
Local Development
uvicorn core.web_api:app --reload # Web API server
ray start --head # Distributed computingProduction
docker build -t nsaf . # Container build
docker-compose up -d # Full stack deploymentCloud Platforms
AWS: Ray on EC2, RDS PostgreSQL, ElastiCache Redis
GCP: Compute Engine, Cloud SQL, Memorystore
Azure: Virtual Machines, Database, Cache
π Monitoring
Metrics: Prometheus integration
Logging: Structured JSON logs
Tracing: OpenTelemetry support
Health Checks: Built-in endpoint monitoring
Alerts: Custom threshold notifications
π€ Contributing
Fork the repository
Create feature branch:
git checkout -b feature/amazing-featureRun tests:
pytest tests/Commit changes:
git commit -m 'Add amazing feature'Push branch:
git push origin feature/amazing-featureOpen Pull Request
π Documentation
API Reference:
/docsendpoint when running serverArchitecture Guide:
docs/architecture.mdDeployment Guide:
docs/deployment.mdExamples:
examples/directory
π Troubleshooting
Common Issues
Missing Dependencies
pip install -r requirements.txt # Install all dependenciesQuantum Backend Errors
qiskit-aer-config # Check quantum setupRay Connection Issues
ray start --head # Start Ray cluster
ray status # Check cluster statusFoundation Model API Errors
export OPENAI_API_KEY="your-key" # Set API keysπ License
MIT License - see LICENSE file for details.
π Acknowledgments
IBM Qiskit team for quantum computing framework
Ray team for distributed computing
OpenAI, Anthropic, Google for foundation model APIs
FastAPI team for web framework
All open source contributors
π Support
Issues: GitHub Issues tracker
Discussions: GitHub Discussions
Author Contact: bolor@ariunbolor.org
Website: https://bolor.me
Built with β€οΈ for the future of AI autonomy
Created by Bolorerdene Bundgaa
NSAF v1.0 - The complete neuro-symbolic autonomy solution