Ambiance MCP Server
Unlock smarter coding: 60-80% fewer tokens, deeper insights, and seamless IDE integration
Tired of bloated code contexts wasting your AI tokens and slowing down your workflow? Ambiance MCP delivers intelligent, compressed code analysis that slashes token usage by 60-80% while preserving full semantic depth. Get precise context for debugging, understanding, and navigation—offline-ready, multi-language, and extensible with optional AI or cloud features. Boost productivity in your IDE without the overhead.
Use as an MCP tool in your IDE or directly from the command line for flexible integration with your development workflow.
Why Ambiance?
Save Tokens & Costs: Semantic compaction means fewer tokens for AI prompts, reducing expenses and speeding up responses.
Deeper Insights Faster: AST parsing and embeddings uncover hidden patterns, helping you debug issues, trace logic, and grasp project architecture in seconds.
Offline Power: Core features work without internet, keeping you productive anywhere.
Seamless Integration: Plug into your IDE for real-time context, with optional AI enhancements for smarter analysis.
Scalable for Any Project: Handles TypeScript, JavaScript, Python, Go, Rust—whether local or GitHub-based.
🚀 Quick Start
1. Install Globally
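Assuming the package is published to npm under its CLI name, `ambiance-mcp` (an assumption; check the repository if the package name differs), a global install would be:

```shell
npm install -g ambiance-mcp
```

Run `ambiance-mcp --help` afterwards to confirm the CLI is on your PATH.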
2. Set Up Embeddings (For Best Results)
In your project directory:
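Matching the embeddings commands listed under Advanced Usage, the setup step is presumably:

```shell
ambiance-mcp embeddings create
```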
This enables semantic search—takes 2-10 minutes once, then auto-updates on changes.
3. Configure Your IDE
Add this to your IDE's MCP server settings. Set WORKSPACE_FOLDER to your project path.
Windows:
macOS/Linux:
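As a sketch, a typical MCP settings entry uses the common `mcpServers` shape (the server label `ambiance` and the `command` value here are assumptions; consult your IDE's MCP documentation for the exact fields):

```json
{
  "mcpServers": {
    "ambiance": {
      "command": "ambiance-mcp",
      "env": {
        "WORKSPACE_FOLDER": "/absolute/path/to/your/project"
      }
    }
  }
}
```

On Windows, use a backslash path such as `C:\\projects\\my-app` for `WORKSPACE_FOLDER`.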
4. Go!
Ambiance auto-activates based on your setup. Add OPENAI_API_KEY for AI boosts or AMBIANCE_API_KEY for GitHub integration.
✨ Core Features & Benefits
Semantic Code Compaction: Shrink contexts by 60-80% without losing meaning—ideal for efficient AI interactions and faster coding.
Project Navigation & Hints: Instantly map your codebase structure, spotting key files and patterns to accelerate onboarding and refactoring.
File & Debug Analysis: Extract symbols, explain code, and pinpoint errors using AST—saving hours on troubleshooting.
Embeddings for Similarity Search: Offline semantic queries find relevant code chunks quickly, enhancing accuracy in large projects.
Multi-Language Support: Works across TypeScript, JavaScript, Python, Go, Rust for versatile development.
🔧 Basic Configuration
Set these environment variables in your IDE config or terminal:
| Variable | Purpose | Required? | Default |
| --- | --- | --- | --- |
| `WORKSPACE_FOLDER` | Your project path | Yes | Auto-detects if possible |
| `USE_LOCAL_EMBEDDINGS` | Enable offline semantic search | No | `false` |
| `OPENAI_API_KEY` | Unlock AI-powered insights | No | - |
| `AMBIANCE_API_KEY` | Access GitHub repos | No | - |
For AI: Add OPENAI_BASE_MODEL=gpt-4 (or your preferred model) and set OPENAI_PROVIDER to target a specific vendor.
For embeddings: Set LOCAL_EMBEDDING_MODEL=all-MiniLM-L6-v2 for customization.
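Pulling the variables above together, a minimal shell setup might look like this (the path and key value are illustrative placeholders):

```shell
# Minimal environment for Ambiance MCP (values are illustrative)
export WORKSPACE_FOLDER="$HOME/projects/my-app"   # your project path
export USE_LOCAL_EMBEDDINGS=true                  # offline semantic search
export LOCAL_EMBEDDING_MODEL=all-MiniLM-L6-v2     # embedding model override
export OPENAI_API_KEY="<your-key>"                # optional: unlocks AI tools
export OPENAI_BASE_MODEL=gpt-4                    # preferred AI model
```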
Provider Credentials
AI features now support multiple OpenAI-compatible providers. Set one of the following keys alongside OPENAI_PROVIDER (default: openai):
| Provider (`OPENAI_PROVIDER`) | Primary Key(s) | Notes |
| --- | --- | --- |
| `openai` | `OPENAI_API_KEY` | Supports GPT‑5 responses API with caching metadata |
| `anthropic` | …, fallback `OPENAI_API_KEY` | Claude 3.5 / Claude 3 family |
| `openrouter` | …, fallback `OPENAI_API_KEY` | OpenRouter aggregated models |
| `grok` | … or …, fallback `OPENAI_API_KEY` | Grok (xAI) via OpenAI protocol |
| `groq` | …, fallback `OPENAI_API_KEY` | Groq hosted Llama models |
| `qwen` | … or …, fallback `OPENAI_API_KEY` | Qwen compatible endpoints |
| `together` | …, fallback `OPENAI_API_KEY` | Together.ai models |
| … | …, fallback `OPENAI_API_KEY` | Requires … |
You can also set a default comparison list with AI_COMPARE_MODELS (comma-separated provider:model pairs) for the CLI comparison utility.
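For instance, reusing the provider:model pairs shown in the CLI examples below:

```shell
# Default model list consumed by the `compare` CLI utility
export AI_COMPARE_MODELS="openai:gpt-5,anthropic:claude-3-5-sonnet-latest"
```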
Advanced Usage
How Embeddings Supercharge Your Workflow
Embeddings generate in the background on first use (with USE_LOCAL_EMBEDDINGS=true), using AST fallback for immediate results. A file watcher auto-updates them every 3 minutes on changes—efficient and incremental.
Manual control via CLI:
- `ambiance-mcp embeddings status` – Check progress and stats.
- `ambiance-mcp embeddings create --force` – Regenerate all.
Available Tools
Use these via your IDE or CLI for targeted analysis.
Core (Offline):
- `local_context`: Compact code for queries like "authentication system".
- `local_project_hints`: Get architecture overviews.
- `local_file_summary`: Analyze files with symbols.
- `local_debug_context`: Debug from error logs.
- `manage_embeddings`: Control embeddings.
AI-Enhanced (Needs `OPENAI_API_KEY`):
- `ai_get_context`: Smarter context with AI.
- `ai_project_hints`: Deeper insights.
- `ai_code_explanation`: Auto-document code.
Cloud (Needs `AMBIANCE_API_KEY`):
- `ambiance_search_github_repos`: Find repos.
- `ambiance_list_github_repos`: List yours.
- `ambiance_get_context`: Pull repo context.
Command Line Interface
Run tools directly for testing or scripts—no IDE needed.
Key Commands:
- `ambiance-mcp context --query "How does auth work?" --task-type understand`
- `ambiance-mcp hints --format json --use-ai`
- `ambiance-mcp summary src/index.ts --include-symbols`
- `ambiance-mcp debug "TypeError: undefined"`
- `ambiance-mcp grep "function $NAME($ARGS)" --language typescript`
- `ambiance-mcp compare --prompt "Summarize the new release notes" --models openai:gpt-5,anthropic:claude-3-5-sonnet-latest`
Global options: --project-path, --format json, --output file.json, --verbose.
For full options, run ambiance-mcp --help.
📖 More Docs
Source & contributions: https://github.com/sbarron/AmbianceMCP
Detailed CLI:
ambiance-mcp --help --expanded
**Change Log: Version 0.2.4** – feat: Major enhancements to embedding management, AI tools, and frontend analysis
Embedding Management & Automation:
Added CLI controls for manual start/stop of automated embeddings updates
Enhanced automatic indexing system with improved background processing
Refactored embedding storage to resolve SQLite memory leak issues
AI Tools Enhancement:
Improved AI-powered project insights with better pattern detection
Enhanced semantic compaction for more efficient code analysis
Updated analysis, explanation, and insights prompt templates
Strengthened local context processing with enhanced semantic understanding
Frontend Analysis Improvements:
Enhanced frontend_insights with better styling file filtering
Added composition analysis for file types in frontend components
Improved environment detection and component analysis capabilities
Infrastructure Updates:
Streamlined CLI documentation with simplified installation instructions
Enhanced tool helper utilities and database evidence processing
Improved project hints functionality for better codebase navigation
**Change Log: Version 0.2.5** – feat: Expanded AI provider support, multi-model comparison tool, enhanced debug context analysis
AI Provider Expansion:
Added support for `openrouter`, `grok`, and `groq` providers
Implemented provider-specific API key environment variable priority system
Enhanced provider configuration with fallback API key support
Multi-Model Comparison Tool:
New `compare` CLI command for side-by-side AI model evaluation
Support for comparing multiple providers and models with the same prompt
Performance metrics, usage statistics, and response comparison
Configurable temperature, max tokens, and system prompts
Debug Context Enhancements:
Improved error context processing with focused embedding queries
Enhanced symbol matching and error type detection
Better semantic relevance ranking for debug assistance
Embedding Management & Automation:
Added CLI controls for manual start/stop of automated embeddings updates
Fixed SQLite memory leak issues in embedding storage
📄 License
MIT – See LICENSE.