Bifrost - VSCode Dev Tools MCP Server
This VS Code extension provides a Model Context Protocol (MCP) server that exposes VSCode's powerful development tools and language features to AI tools. It enables advanced code navigation, analysis, and manipulation capabilities when using AI coding assistants that support the MCP protocol.
Features
Language Server Integration: Access VSCode's language server capabilities for any supported language
Code Navigation: Find references, definitions, implementations, and more
Symbol Search: Search for symbols across your workspace
Code Analysis: Get semantic tokens, document symbols, and type information
Smart Selection: Use semantic selection ranges for intelligent code selection
Code Actions: Access refactoring suggestions and quick fixes
HTTP/SSE Server: Exposes language features over an MCP-compatible HTTP server
AI Assistant Integration: Ready to work with AI assistants that support the MCP protocol
Usage
Cline Installation
Step 1. Install Supergateway
Step 2. Add config to cline
Step 3. The server entry may show up red in Cline, but it works fine
Windows Config
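A minimal sketch of a Cline MCP settings entry for Windows, assuming Supergateway is used to bridge Bifrost's SSE endpoint to stdio and the server is running on the default port 8008; the server name "bifrost" is a placeholder:

```json
{
  "mcpServers": {
    "bifrost": {
      "command": "cmd",
      "args": [
        "/c",
        "npx",
        "-y",
        "supergateway",
        "--sse",
        "http://localhost:8008/sse"
      ],
      "disabled": false,
      "autoApprove": []
    }
  }
}
```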
Mac/Linux Config
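On Mac/Linux, the same sketch without the cmd /c wrapper:

```json
{
  "mcpServers": {
    "bifrost": {
      "command": "npx",
      "args": [
        "-y",
        "supergateway",
        "--sse",
        "http://localhost:8008/sse"
      ],
      "disabled": false,
      "autoApprove": []
    }
  }
}
```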
Roo Code Installation
Step 1: Add the SSE config to your global or project-based MCP configuration
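A sketch of an SSE entry for Roo Code's MCP configuration, assuming the default port 8008 and a placeholder server name:

```json
{
  "mcpServers": {
    "bifrost": {
      "url": "http://localhost:8008/sse"
    }
  }
}
```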
Cursor Installation
Follow this video to install and use with Cursor. I have also provided sample rules that can be used in .cursorrules files for better results.
For new versions of Cursor, use the following configuration:
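A sketch of an SSE entry for newer Cursor versions (for example in .cursor/mcp.json), assuming the default port 8008; adjust the URL if you use a project-specific endpoint:

```json
{
  "mcpServers": {
    "bifrost": {
      "url": "http://localhost:8008/sse"
    }
  }
}
```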
Multiple Project Support
When working with multiple projects, each project can have its own dedicated MCP server endpoint and port. This is useful when you have multiple VS Code windows open or are working with multiple projects that need language server capabilities.
Project Configuration
Create a bifrost.config.json file in your project root:
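The exact schema is not reproduced here; based on the endpoint format and port behavior described below, a plausible sketch (field names other than port are assumptions) might look like:

```json
{
  "projectName": "my-project",
  "description": "A short description of the project",
  "path": "/my-project",
  "port": 5642
}
```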
The server will use this configuration to:
Create project-specific endpoints (e.g., http://localhost:5642/my-project/sse)
Provide project information to AI assistants
Use a dedicated port for each project
Isolate project services from other running instances
Example Configurations
Backend API Project:
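A hypothetical backend configuration, following the sketch above:

```json
{
  "projectName": "backend-api",
  "description": "REST API backend service",
  "path": "/backend-api",
  "port": 5642
}
```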
Frontend Web App:
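A hypothetical frontend configuration on its own port:

```json
{
  "projectName": "frontend-app",
  "description": "Frontend web application",
  "path": "/frontend-app",
  "port": 5643
}
```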
Port Configuration
Each project should specify its own unique port to avoid conflicts when multiple VS Code instances are running:
The port field in bifrost.config.json determines which port the server will use
If no port is specified, it defaults to 8008 for backwards compatibility
Choose different ports for different projects to ensure they can run simultaneously
The server will fail to start if the configured port is already in use, requiring you to either:
Free up the port
Change the port in the config
Close the other VS Code instance using that port
Connecting to Project-Specific Endpoints
Update your AI assistant configuration to use the project-specific endpoint and port:
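For example, an SSE entry pointing at the hypothetical backend-api project shown above:

```json
{
  "mcpServers": {
    "backend-api": {
      "url": "http://localhost:5642/backend-api/sse"
    }
  }
}
```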
Backwards Compatibility
If no bifrost.config.json is present, the server will use the default configuration:
Port: 8008
SSE endpoint: http://localhost:8008/sse
Message endpoint: http://localhost:8008/message
This maintains compatibility with existing configurations and tools.
Available Tools
The extension provides access to many VSCode language features including:
find_usages: Locate all symbol references.
go_to_definition: Jump to symbol definitions instantly.
find_implementations: Discover implementations of interfaces/abstract methods.
get_hover_info: Get rich symbol docs on hover.
get_document_symbols: Outline all symbols in a file.
get_completions: Context-aware auto-completions.
get_signature_help: Function parameter hints and overloads.
get_rename_locations: Safely rename symbols across the project.
get_code_actions: Quick fixes, refactors, and improvements.
get_semantic_tokens: Enhanced highlighting data.
get_call_hierarchy: See incoming/outgoing call relationships.
get_type_hierarchy: Visualize class and interface inheritance.
get_code_lens: Inline insights (references, tests, etc.).
get_selection_range: Smart selection expansion for code blocks.
get_type_definition: Jump to underlying type definitions.
get_declaration: Navigate to symbol declarations.
get_document_highlights: Highlight all occurrences of a symbol.
get_workspace_symbols: Search symbols across your entire workspace.
Requirements
Visual Studio Code version 1.93.0 or higher
Appropriate language extensions for the languages you want to work with (e.g., C# extension for C# files)
Installation
Install this extension from the VS Code marketplace
Install any language-specific extensions you need for your development
Open your project in VS Code
Usage
The extension will automatically start an MCP server when activated. To configure an AI assistant to use this server:
The server runs on port 8008 by default
Configure your MCP-compatible AI assistant to connect to:
SSE endpoint: http://localhost:8008/sse
Message endpoint: http://localhost:8008/message
Available Commands
Bifrost MCP: Start Server - Manually start the MCP server on port 8008
Bifrost MCP: Start Server on port - Manually start the MCP server on a specified port
Bifrost MCP: Stop Server - Stop the running MCP server
Bifrost MCP: Open Debug Panel - Open the debug panel to test available tools
Star History
Example Tool Usage
Find References
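A sketch of a find_usages tool call; the argument shape is an assumption modeled on LSP textDocument/position parameters, and the file path is a placeholder:

```json
{
  "name": "find_usages",
  "arguments": {
    "textDocument": {
      "uri": "file:///path/to/your/file.ts"
    },
    "position": {
      "line": 10,
      "character": 15
    },
    "context": {
      "includeDeclaration": true
    }
  }
}
```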
Workspace Symbol Search
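A sketch of a get_workspace_symbols call; the query argument name is an assumption modeled on the LSP workspace/symbol request:

```json
{
  "name": "get_workspace_symbols",
  "arguments": {
    "query": "UserService"
  }
}
```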
Debugging
Use the Bifrost MCP: Open Debug Panel command to test the available tools.
Troubleshooting
If you encounter issues:
Ensure you have the appropriate language extensions installed for your project
Check that your project has loaded correctly in VSCode
Verify that port 8008 is available on your system
Check the VSCode output panel for any error messages
Contributing
Here are VS Code's commands if you want to add additional functionality; rename and a few other features are still needed. Please feel free to submit issues or pull requests to the GitHub repository.
License
This extension is licensed under the AGPL-3.0 License.