The MCP Pyrefly Autotype Server provides AI clients with comprehensive Python type analysis and annotation tools using the Pyrefly engine.
Core Capabilities:
- Analyze Python files: Examine files for missing type annotations with detailed analysis options
- Add type annotations: Automatically infer and add types with backup, safe mode, and aggressive inference options
- Type-check files: Validate annotations and report errors
- Get project context: Retrieve project-wide type information for better inference
- Generate analysis prompts: Create AI-optimized prompts for identifying type annotation needs
- Develop type improvement plans: Formulate comprehensive strategies for enhancing project-wide type coverage
Workflow Support:
- Support integrated AI workflows (analyze → annotate → verify)
- Process completely untyped or legacy codebases
- Enable incremental typing adoption on a file-by-file basis
- Integrate with CI/CD pipelines for type quality validation
- Provide structured JSON output for easy AI consumption
- Respect project configuration via `pyrefly.toml` or `pyproject.toml`
- Integrate with GitHub repositories to analyze and annotate Python code
- Support pytest-based testing workflows for validating type annotations
- Provide automated type annotation, type checking, and analysis for Python files and projects using Pyrefly
- Integrate with Ruff for linting Python code as part of the type annotation workflow
[WORK IN PROGRESS AND UNTESTED - USE AT OWN RISK] MCP Pyrefly Autotype Server
A Model Context Protocol (MCP) server that provides automatic Python type annotation using Pyrefly. This server enables LLMs and AI coding assistants to analyze Python code, add type annotations, and perform type checking seamlessly.
What is the Model Context Protocol (MCP)?
The Model Context Protocol (MCP) is an open standard that enables AI assistants and language models to securely access external data sources and tools. MCP servers act as bridges between AI systems and various resources, providing structured access to information and capabilities.
How MCP Works
MCP servers can provide:
- Resources: Static or dynamic data sources (files, databases, APIs)
- Tools: Executable functions that perform actions
- Prompts: Templated prompts for specific tasks
This allows AI assistants to:
- Access real-time information
- Perform complex operations
- Integrate with existing tools and workflows
- Maintain security through controlled access
Features
This MCP server provides comprehensive Python type annotation capabilities:
🔍 Analysis Tools
- File Analysis: Analyze individual Python files for missing type annotations
- Project Context: Get project-wide type information for better inference
- Pyrefly Integration: Leverage Pyrefly's powerful type inference engine
⚡ Type Enhancement
- Automatic Type Addition: Add type annotations using Pyrefly's autotype feature
- File-based Processing: Annotate individual Python files one at a time
- Optional Backup: Can create backup files before modification (when requested)
- Project Integration: Respects pyrefly configuration files
✅ Type Checking
- Pyrefly Integration: Validate type annotations using Pyrefly's built-in type checker
- Error Reporting: Basic type checking results and error output
- File-based Validation: Check individual files for type errors
🤖 LLM Integration
- Basic Prompts: Pre-built prompts for type analysis tasks
- Structured Data: JSON-formatted analysis results
- Simple Workflows: Basic analyze → annotate → verify workflows
Why Use This MCP Server?
For LLMs and AI Assistants
- MCP Integration: Works with MCP-compatible AI clients
- JSON Responses: Provides structured data for better decision making
- Basic Context: Simple project structure analysis
- Error Handling: Basic error reporting and graceful failure handling
For Developers
- Cold Start Helper: Assists with completely untyped codebases
- Basic Typing: Simple type annotation workflows
- File Processing: Individual file type checking and annotation
- Tool Integration: Basic integration with existing Python development workflows
Installation
Prerequisites
- Python 3.8 or higher
- uv (fast Python package manager): install with `pip install uv`, or see the uv installation guide
Install the MCP Server
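A minimal sketch of a local install with uv, assuming you work from a clone of this repository (the checkout directory name below is an assumption):

```bash
git clone <repository-url>   # replace with this repository's URL
cd mcp-pyrefly-autotype      # assumed checkout directory name
uv sync                      # install dependencies into a local environment
```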
Usage
Running the Server
The server can be run directly or integrated with MCP-compatible clients:
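For a direct run over stdio, something like the following works, assuming the server entry point is `server.py` (adjust to the actual script in your checkout):

```bash
# Start the MCP server over stdio from the project directory
uv run server.py
```

MCP clients normally launch the server themselves using the client configurations shown below.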
Integration with AI Clients
Claude Desktop (Example Configuration)
Add to your Claude Desktop configuration:
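A sketch of the relevant `claude_desktop_config.json` stanza, assuming the server is launched with uv from a local clone (the directory path and `server.py` entry point are assumptions; adjust them to your checkout):

```json
{
  "mcpServers": {
    "pyrefly-autotype": {
      "command": "uv",
      "args": [
        "--directory",
        "/path/to/mcp-pyrefly-autotype",
        "run",
        "server.py"
      ]
    }
  }
}
```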
VS Code with Copilot
- Install the MCP extension for VS Code
- Configure the server in your workspace by creating a `.vscode/mcp.json` file (see the example below)
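A sketch of a `.vscode/mcp.json`, again assuming a uv-launched `server.py` in the workspace (adjust the entry point if your checkout differs):

```json
{
  "servers": {
    "pyrefly-autotype": {
      "type": "stdio",
      "command": "uv",
      "args": ["--directory", "${workspaceFolder}", "run", "server.py"]
    }
  }
}
```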
Make it show up in VS Code (MCP Servers + Copilot Chat Tools)
- Install the “Model Context Protocol (MCP)” extension in VS Code and ensure GitHub Copilot is enabled/updated.
- Save the `.vscode/mcp.json` file shown above in the root of your workspace.
- Reload the window: press Ctrl+Shift+P → “Developer: Reload Window”.
- Verify in the MCP Servers view:
  - Open the Command Palette (Ctrl+Shift+P) → run “MCP: Show Servers”, or open the “MCP Servers” view from the Activity Bar.
  - You should see a server named `pyrefly-autotype` with status Running. If not:
    - Confirm `uv` is installed and on PATH, and that `uv sync` has been run.
    - On Windows, you may need to restart VS Code after installing Python/uv.
- Verify in Copilot Chat Tools:
  - Open Copilot Chat (Ctrl+I or the Copilot icon).
  - In the Tools pane, expand the MCP section. You should see `pyrefly-autotype` listed. If it’s missing, check that:
    - The workspace is trusted (look for the “Trust” banner in VS Code).
    - MCP integration is enabled in Copilot settings.
Run sample queries (inside Copilot Chat)
Try these prompts in a new Copilot Chat tab. Copilot will call the server’s tools for you.
- “Use the pyrefly-autotype MCP server to analyze the file `simple_untyped.py` (detailed=true), then add types to it, and finally type check it. Repeat add→check up to 3 times until type check passes.”
- “Analyze `example_untyped.py` for missing annotations, add types with a backup, and run a type check. Summarize changes and remaining warnings.”
- “Given the loop in SamplePrompt.md, run the agent loop on `simple_untyped.py`: add_types_to_file → type_check_file, refining up to 3 rounds.”
Expected outcomes:
- Copilot will invoke these MCP tools: `analyze_python_file`, `add_types_to_file`, `type_check_file`.
- The file will be annotated in-place (a backup may be created depending on your request).
- You’ll receive a summary and any remaining non-blocking warnings.
Available Tools
analyze_python_file
Analyze a Python file for missing type annotations.
Parameters:
- `file_path` (required): Path to the Python file
- `detailed` (optional): Include detailed analysis information
Example:
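A minimal tool-call sketch (arguments follow the MCP `tools/call` shape; the file name is the sample file shipped with this repo):

```json
{
  "name": "analyze_python_file",
  "arguments": {
    "file_path": "simple_untyped.py",
    "detailed": true
  }
}
```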
add_types_to_file
Add type annotations to a Python file using Pyrefly (this invokes `pyrefly autotype` under the hood).
Parameters:
- `file_path` (required): Path to the Python file
- `backup` (optional): Create backup before modifying (default: true)
Example:
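Illustrative call, using the same sample file with an explicit backup:

```json
{
  "name": "add_types_to_file",
  "arguments": {
    "file_path": "simple_untyped.py",
    "backup": true
  }
}
```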
type_check_file
Run type checking on a Python file using Pyrefly.
Parameters:
- `file_path` (required): Path to the Python file
Example:
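Illustrative call:

```json
{
  "name": "type_check_file",
  "arguments": {
    "file_path": "simple_untyped.py"
  }
}
```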
get_project_context
Get project-wide type information for better inference.
Parameters:
- `project_path` (required): Path to the project directory
Example:
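Illustrative call (the path is a placeholder for your project root):

```json
{
  "name": "get_project_context",
  "arguments": {
    "project_path": "/path/to/your/project"
  }
}
```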
Available Prompts
analyze_typing_needs
Generate analysis prompts for type annotation needs.
type_improvement_plan
Create a comprehensive plan for improving type coverage in a project.
Example Workflows
1. Complete File Type Enhancement
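A sketch of the analyze → annotate → verify sequence an assistant would issue for a single file (illustrative arguments; the target path is a placeholder):

```json
[
  { "name": "analyze_python_file", "arguments": { "file_path": "src/module.py", "detailed": true } },
  { "name": "add_types_to_file", "arguments": { "file_path": "src/module.py", "backup": true } },
  { "name": "type_check_file", "arguments": { "file_path": "src/module.py" } }
]
```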
2. Project-Wide Type Analysis
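A sketch of a project-level pass: gather project context first, then analyze individual files of interest (paths are placeholders):

```json
[
  { "name": "get_project_context", "arguments": { "project_path": "." } },
  { "name": "analyze_python_file", "arguments": { "file_path": "src/legacy/utils.py", "detailed": true } }
]
```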
3. Cold Start Type Addition
Use Cases
🥶 Cold Start Projects
- Challenge: Legacy codebases with no type annotations
- Solution: Use Pyrefly autotype with basic MCP integration
- Benefit: Start adding types to untyped codebases
📈 Incremental Typing
- Challenge: Adding types to active projects gradually
- Solution: File-by-file type annotation with basic project context
- Benefit: Gradual type adoption without major disruption
🔧 CI/CD Integration
- Challenge: Maintaining type quality in team projects
- Solution: Basic type checking integration in pipelines
- Benefit: Simple type validation workflows
🤝 LLM-Assisted Development
- Challenge: LLMs need context about typing needs
- Solution: Basic structured analysis data and simple prompts
- Benefit: Improved AI assistance for Python type annotation tasks
Configuration
Pyrefly Configuration
The server respects Pyrefly's configuration. You can configure Pyrefly in your project using either:
- A `pyrefly.toml` file in your project root
- A `pyproject.toml` file with a `[tool.pyrefly]` section
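For example, the `pyproject.toml` variant nests the options under `[tool.pyrefly]`, while a standalone `pyrefly.toml` takes the same options at the top level of the file. The specific option keys are documented in the Pyrefly Configuration Documentation and are deliberately omitted here:

```toml
# pyproject.toml
[tool.pyrefly]
# Pyrefly options go here; see the Pyrefly Configuration Documentation
# for the supported keys.
```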
See the Pyrefly Configuration Documentation for all available options.
Development
Running Tests
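Assuming the suite under `tests/` is pytest-based (pytest integration is mentioned above), a typical run looks like:

```bash
uv run pytest tests/ -v
```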
Testing the MCP Server
The project includes several test files to verify functionality:
- `tests/test_server.py` - Comprehensive test suite with mocked pyrefly calls
- `test_direct.py` - Direct testing of server functions with real pyrefly
- `test_demo.py` - Interactive demo showing the complete workflow
- `simple_untyped.py` - Example file for testing type annotation
To test the server end-to-end:
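For example, by running the scripts listed above directly (the exact invocation may differ in your environment):

```bash
uv run python test_direct.py   # server functions against real pyrefly
uv run python test_demo.py     # interactive demo of the full workflow
```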
Code Quality
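Ruff handles linting in this workflow (see above); an assumed invocation with uv:

```bash
uv run ruff check .    # lint
uv run ruff format .   # format, if the project uses Ruff's formatter
```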
Contributing
- Fork the repository
- Create a feature branch (`git checkout -b feature/amazing-feature`)
- Commit your changes (`git commit -m 'Add amazing feature'`)
- Push to the branch (`git push origin feature/amazing-feature`)
- Open a Pull Request
License
This project is licensed under the MIT License - see the LICENSE file for details.
Related Projects
- Pyrefly - The core type inference engine
- Model Context Protocol - The MCP specification
Support
For questions and support:
- Open an issue on GitHub
- Check the Pyrefly documentation
- Review the MCP specification
This MCP server bridges the gap between AI assistants and Python type annotation tools, enabling seamless integration of type enhancement workflows in AI-powered development environments.
Sample Queries and Prompt Library
The `sample_queries/` directory contains ready-to-use prompt templates you can paste into your AI client (VS Code with Copilot MCP or Claude Desktop) to drive the server effectively:
- `sample_queries/PromptWithTools.md` — A compact “agent loop” prompt that instructs the assistant to use the available tools (`add_types_to_file`, `type_check_file`, and optionally `get_project_context`) and iterate up to 3 refinement rounds. Great for single-file or small feature work.
- `sample_queries/LargeUntypedCodebase.md` — A batch-oriented workflow for incrementally typing a large, mostly-untyped repo. It includes planning, per-file refine loops, batch gates, and progress tracking guidance.
How to use with VS Code + Copilot:
- Open Copilot Chat. Ensure the `pyrefly-autotype` MCP server appears under Tools (see instructions above).
- Open one of the markdown files in `sample_queries/`, copy the prompt, and paste it into Copilot Chat.
- If the prompt includes tool call JSON examples, Copilot will translate them into MCP tool invocations automatically.
How to use with Claude Desktop:
- Ensure your Claude MCP configuration includes this server (see “Claude Desktop” section above).
- Open a new chat, paste any of the prompts, and follow the agent’s steps. Claude will call the MCP tools using the provided JSON shapes.
Tip: Start with `PromptWithTools.md` on a single file (e.g., `simple_untyped.py`) to see the full add → check → refine flow end to end. Then progress to `LargeUntypedCodebase.md` for multi-file, incremental adoption.