Nuanced MCP Server
A Model Context Protocol (MCP) server that provides call graph analysis capabilities to LLMs through the nuanced library.
Overview
This MCP server enables LLMs to understand code structure by accessing function call graphs through standardized tools and resources. It allows AI assistants to:
- Initialize call graphs for Python repos
- Explore function call relationships
- Analyze dependencies between functions
- Provide more contextually aware code assistance
API
Tools
- initialize_graph
  - Initialize a code graph for the given repository path
  - Input: repo_path (string)
- switch_repository
  - Switch to a different initialized repository
  - Input: repo_path (string)
- list_repositories
  - List all initialized repositories
  - No inputs required
- get_function_call_graph
  - Get the call graph for a specific function
  - Inputs:
    - file_path (string)
    - function_name (string)
    - repo_path (string, optional) - uses the active repository if not specified
- analyze_dependencies
  - Find all module or file dependencies in the codebase
  - Inputs (at least one required):
    - file_path (string, optional)
    - module_name (string, optional)
- analyze_change_impact
  - Analyze the impact of changing a specific function
  - Inputs:
    - file_path (string)
    - function_name (string)
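For illustration, here is a minimal client-side sketch that calls two of these tools through the MCP Python SDK. The launch command, script name, repository path, and file/function values are placeholders, not part of this server's documented interface:

```python
import asyncio

from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client

# Placeholder launch command - adjust to however you run the Nuanced MCP server.
server_params = StdioServerParameters(command="uv", args=["run", "server.py"])


async def main() -> None:
    async with stdio_client(server_params) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()

            # Build a code graph for a repository (placeholder path).
            await session.call_tool(
                "initialize_graph", arguments={"repo_path": "/path/to/my-project"}
            )

            # Retrieve the call graph for one function in that repository.
            result = await session.call_tool(
                "get_function_call_graph",
                arguments={"file_path": "src/app.py", "function_name": "main"},
            )
            print(result.content)


asyncio.run(main())
```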
Resources
- graph://summary
  - Get a summary of the currently loaded code graph
  - No parameters required
- graph://repo/{repo_path}/summary
  - Get a summary of a specific repository's code graph
  - Parameters:
    - repo_path (string) - Path to the repository
- graph://function/{file_path}/{function_name}
  - Get detailed information about a specific function
  - Parameters:
    - file_path (string) - Path to the file containing the function
    - function_name (string) - Name of the function to analyze
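As a sketch of how a client might read these resources (reusing a ClientSession like the one in the Tools example; the file and function names are placeholders, and the exact URI encoding of nested paths may differ):

```python
from mcp import ClientSession
from pydantic import AnyUrl


async def read_graph_resources(session: ClientSession) -> None:
    # Summary of the currently loaded code graph.
    summary = await session.read_resource(AnyUrl("graph://summary"))
    print(summary.contents)

    # Details for a single function (placeholder file and function names).
    fn_info = await session.read_resource(AnyUrl("graph://function/app.py/main"))
    print(fn_info.contents)
```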
Prompts
- analyze_function
  - Create a prompt to analyze a function with its call graph
  - Parameters:
    - file_path (string) - Path to the file containing the function
    - function_name (string) - Name of the function to analyze
- impact_analysis
  - Create a prompt to analyze the impact of changing a function
  - Parameters:
    - file_path (string) - Path to the file containing the function
    - function_name (string) - Name of the function to analyze
- analyze_dependencies_prompt
  - Create a prompt to analyze dependencies of a file or module
  - Parameters (at least one required):
    - file_path (string, optional) - Path to the file to analyze
    - module_name (string, optional) - Name of the module to analyze
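And a sketch of requesting one of these prompts from a client session; the argument values are placeholders:

```python
from mcp import ClientSession


async def build_analysis_prompt(session: ClientSession) -> str:
    # Ask the server to render the analyze_function prompt for a given function.
    result = await session.get_prompt(
        "analyze_function",
        arguments={"file_path": "app.py", "function_name": "main"},
    )
    # Join the text parts of the returned messages into a single prompt string.
    return "\n".join(
        msg.content.text for msg in result.messages if hasattr(msg.content, "text")
    )
```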
Usage with Claude Desktop
Add this to your claude_desktop_config.json
UV
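A minimal sketch of the entry, assuming the server is run with uv from a local checkout; the server name, directory, and script name below are placeholders to adapt to your installation:

```json
{
  "mcpServers": {
    "nuanced": {
      "command": "uv",
      "args": [
        "--directory",
        "/path/to/nuanced-mcp",
        "run",
        "server.py"
      ]
    }
  }
}
```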