DeepView MCP

DeepView MCP is a Model Context Protocol server that enables IDEs like Cursor and Windsurf to analyze large codebases using Gemini's extensive context window.

Features

  • Load an entire codebase from a single text file (e.g., created with tools like repomix)
  • Query the codebase using Gemini's large context window
  • Connect to IDEs that support the MCP protocol, like Cursor and Windsurf
  • Configurable Gemini model selection via command-line arguments

Prerequisites

  • Python 3
  • A Google Gemini API key, provided to the server via the GEMINI_API_KEY environment variable (see the IDE configuration below)
  • An IDE that supports the MCP protocol, such as Cursor or Windsurf

Installation

Installing via Smithery

To install DeepView for Claude Desktop automatically via Smithery:

npx -y @smithery/cli install @ai-1st/deepview-mcp --client claude

Using pip

pip install deepview-mcp
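
To confirm the package installed correctly, you can inspect it with pip:

pip show deepview-mcp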

Usage

Starting the Server

Note: you don't need to start the server manually; these parameters are configured in your IDE's MCP setup (see below).

# Basic usage with default settings
deepview-mcp [path/to/codebase.txt]

# Specify a different Gemini model
deepview-mcp [path/to/codebase.txt] --model gemini-2.0-pro

# Change log level
deepview-mcp [path/to/codebase.txt] --log-level DEBUG

The codebase file parameter is optional. If it is not provided, you'll need to specify a codebase file when making queries (see the deepview tool's codebase_file parameter below).

Command-line Options

  • --model MODEL: Specify the Gemini model to use (default: gemini-2.0-flash-lite)
  • --log-level {DEBUG,INFO,WARNING,ERROR,CRITICAL}: Set the logging level (default: INFO)
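
Both options can be combined in a single invocation; for example, a sketch reusing the flags documented above:

# Load a codebase, select a model, and enable debug logging
deepview-mcp /path/to/codebase.txt --model gemini-2.0-pro --log-level DEBUG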

Using with an IDE (Cursor/Windsurf/...)

  1. Open IDE settings
  2. Navigate to the MCP configuration
  3. Add a new MCP server with the following configuration:
    { "mcpServers": { "deepview": { "command": "/path/to/deepview-mcp", "args": [], "env": { "GEMINI_API_KEY": "your_gemini_api_key" } } } }

Setting a codebase file is optional. If you repeatedly work with the same codebase, you can set a default codebase file using the following configuration:

{ "mcpServers": { "deepview": { "command": "/path/to/deepview-mcp", "args": ["/path/to/codebase.txt"], "env": { "GEMINI_API_KEY": "your_gemini_api_key" } } } }

Here's how to specify the Gemini version to use:

{ "mcpServers": { "deepview": { "command": "/path/to/deepview-mcp", "args": ["--model", "gemini-2.5-pro-exp-03-25"], "env": { "GEMINI_API_KEY": "your_gemini_api_key" } } } }
  1. Reload MCP servers configuration

Available Tools

The server provides one tool:

  1. deepview: Ask a question about the codebase
    • Required parameter: question - The question to ask about the codebase
    • Optional parameter: codebase_file - Path to a codebase file to load before querying
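
Your IDE issues this call on your behalf, but for illustration, a raw MCP tools/call request to the deepview tool would look roughly like this (the question text and file path are placeholder values):

{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "tools/call",
  "params": {
    "name": "deepview",
    "arguments": {
      "question": "Where is the authentication logic implemented?",
      "codebase_file": "/path/to/codebase.txt"
    }
  }
}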

Preparing Your Codebase

DeepView MCP requires a single file containing your entire codebase. You can use repomix to prepare your codebase in an AI-friendly format.

Using repomix

  1. Basic Usage: Run repomix in your project directory to create a default output file:
# Make sure you're using Node.js 18.17.0 or higher
npx repomix

This will generate a repomix-output.xml file containing your codebase; a configuration sketch at the end of this section shows how to point DeepView MCP at it.

  2. Custom Configuration: Create a configuration file to customize which files get packaged and the output format:
npx repomix --init

This creates a repomix.config.json file that you can edit to:

  • Include/exclude specific files or directories
  • Change the output format (XML, JSON, TXT)
  • Set the output filename
  • Configure other packaging options

Example repomix Configuration

Here's an example repomix.config.json file:

{ "include": [ "**/*.py", "**/*.js", "**/*.ts", "**/*.jsx", "**/*.tsx" ], "exclude": [ "node_modules/**", "venv/**", "**/__pycache__/**", "**/test/**" ], "output": { "format": "xml", "filename": "my-codebase.xml" } }

For more information on repomix, visit the repomix GitHub repository.
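
Once repomix has produced the packed file, point DeepView MCP at it in your IDE's MCP configuration; for example, reusing the configuration shape shown earlier (the command path and API key are placeholders):

{
  "mcpServers": {
    "deepview": {
      "command": "/path/to/deepview-mcp",
      "args": ["/path/to/repomix-output.xml"],
      "env": {
        "GEMINI_API_KEY": "your_gemini_api_key"
      }
    }
  }
}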

License

MIT

Author

Dmitry Degtyarev (ddegtyarev@gmail.com)
