
NotebookLM MCP Server

by khengyun

🤖 NotebookLM MCP Server

Professional Model Context Protocol (MCP) server for automating interactions with Google's NotebookLM. Features persistent browser sessions, streaming response support, and comprehensive automation capabilities.

Key Features

🚀 Advanced Automation

  • Persistent Browser Sessions - Login once, auto-authenticate forever
  • Streaming Response Support - Proper handling of LLM streaming responses
  • Multiple Chat Methods - Send/receive individually or combined operations
  • Bot-Detection Bypass - Uses undetected-chromedriver to avoid Google's automation checks
  • Smart DOM Interaction - Intelligent selectors with multiple fallbacks
  • Comprehensive Error Handling - Robust fallbacks and detailed logging

💬 Chat Operations

| Method | Description | Streaming | Use Case |
|--------|-------------|-----------|----------|
| `send_message` | Send chat message | | Quick message sending |
| `get_response` | Get complete response | | Wait for full AI response |
| `get_quick_response` | Get current response | | Immediate response check |
| `chat_with_notebook` | Combined send + receive | | One-shot conversations |
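The difference between the separate and combined patterns can be sketched with a stand-in stub (the real `NotebookLMClient` drives a browser session instead; the stub below just echoes its input):

```python
import asyncio


class StubClient:
    """Stand-in for NotebookLMClient, used only to illustrate call patterns."""

    async def send_message(self, text: str) -> None:
        self._last = text

    async def get_response(self, wait_for_completion: bool = True) -> str:
        return f"echo: {self._last}"

    async def chat_with_notebook(self, text: str, max_wait: int = 60) -> str:
        # Combined operation: send, then wait for the reply
        await self.send_message(text)
        return await self.get_response()


async def main():
    client = StubClient()
    # Pattern 1: send and receive as separate steps
    await client.send_message("hello")
    reply = await client.get_response(wait_for_completion=True)
    # Pattern 2: one-shot combined operation
    combined = await client.chat_with_notebook("hello")
    return reply, combined


reply, combined = asyncio.run(main())
print(reply)     # echo: hello
print(combined)  # echo: hello
```

Use the separate pattern when you want to do other work between sending and polling; use the combined call for simple request/reply exchanges.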

📚 Notebook Management

  • Navigate to specific notebooks
  • Upload documents to notebooks
  • List available notebooks
  • Create new notebooks
  • Search within notebooks
  • Export conversation history
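Each of these operations maps onto an MCP tool (see the tools reference below). Over MCP's JSON-RPC transport, a tool invocation such as a document upload would look roughly like the following (the file path here is a made-up example):

```json
{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "tools/call",
  "params": {
    "name": "upload_document",
    "arguments": { "file_path": "./docs/example.pdf" }
  }
}
```

MCP clients such as AutoGen's `McpWorkbench` construct these requests for you, so you rarely write them by hand.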

🚀 Quick Start

Installation

```bash
# Install from PyPI
pip install notebooklm-mcp

# Or install from source
git clone https://github.com/notebooklm-mcp/notebooklm-mcp.git
cd notebooklm-mcp
pip install -e .
```

One-Time Setup

```bash
# First run - opens browser for manual login
notebooklm-mcp chat --notebook YOUR_NOTEBOOK_ID

# Login manually when browser opens
# Session automatically saved for future runs ✨
```

Start MCP Server

```bash
# Start server with your notebook
notebooklm-mcp server --notebook 4741957b-f358-48fb-a16a-da8d20797bc6 --headless

# Or use environment variables
export NOTEBOOKLM_NOTEBOOK_ID="your-notebook-id"
export NOTEBOOKLM_HEADLESS="true"
notebooklm-mcp server
```

Interactive Chat

```bash
# Interactive chat session
notebooklm-mcp chat --notebook your-notebook-id

# Send single message
notebooklm-mcp chat --notebook your-notebook-id --message "Summarize this document"
```

📖 Usage Examples

Python API

```python
import asyncio

from notebooklm_mcp import NotebookLMClient, ServerConfig


async def main():
    # Configure client
    config = ServerConfig(
        default_notebook_id="your-notebook-id",
        headless=True,
        debug=True,
    )
    client = NotebookLMClient(config)

    try:
        # Start browser with persistent session
        await client.start()

        # Authenticate (automatic with saved session)
        await client.authenticate()

        # Send message and get streaming response
        await client.send_message("What are the key insights from this document?")
        response = await client.get_response(wait_for_completion=True)
        print(f"NotebookLM: {response}")
    finally:
        await client.close()


asyncio.run(main())
```

MCP Integration with AutoGen

```python
from autogen_ext.tools.mcp import McpWorkbench, StdioServerParams

# Configure MCP server
params = StdioServerParams(
    command="notebooklm-mcp",
    args=["server", "--notebook", "your-notebook-id", "--headless"],
)

# Create MCP workbench
workbench = McpWorkbench(params)

# Use tools
await workbench.call_tool(
    "chat_with_notebook",
    {
        "message": "Analyze the main themes in this research paper",
        "max_wait": 60,
    },
)
```

🛠️ Advanced Configuration

Environment Variables

```bash
# Core settings
export NOTEBOOKLM_NOTEBOOK_ID="your-notebook-id"
export NOTEBOOKLM_HEADLESS="true"
export NOTEBOOKLM_DEBUG="false"
export NOTEBOOKLM_TIMEOUT="60"

# Authentication
export NOTEBOOKLM_PROFILE_DIR="./chrome_profile"
export NOTEBOOKLM_PERSISTENT_SESSION="true"

# Streaming
export NOTEBOOKLM_STREAMING_TIMEOUT="60"
```

📊 MCP Tools Reference

| Tool | Arguments | Description |
|------|-----------|-------------|
| `healthcheck` | None | Server health status |
| `send_chat_message` | `message: str` | Send message to NotebookLM |
| `get_chat_response` | `wait_for_completion: bool`, `max_wait: int` | Get response with streaming support |
| `get_quick_response` | None | Get current response immediately |
| `chat_with_notebook` | `message: str`, `max_wait: int` | Combined send + receive operation |
| `navigate_to_notebook` | `notebook_id: str` | Navigate to specific notebook |
| `upload_document` | `file_path: str` | Upload document to notebook |
| `list_notebooks` | None | List available notebooks |
| `create_notebook` | `title: str` | Create new notebook |
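These tools become available to any MCP-capable client once the server is registered. As one example (not from this project's docs), a Claude Desktop entry might look like the following; the exact config file location and schema depend on the client you use:

```json
{
  "mcpServers": {
    "notebooklm": {
      "command": "notebooklm-mcp",
      "args": ["server", "--notebook", "your-notebook-id", "--headless"]
    }
  }
}
```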

🔧 Development

Setup Development Environment

```bash
# Clone repository
git clone https://github.com/notebooklm-mcp/notebooklm-mcp.git
cd notebooklm-mcp

# Install development dependencies
pip install -e ".[dev]"

# Install pre-commit hooks
pre-commit install
```

Running Tests

```bash
# Run all tests
pytest

# Run with coverage
pytest --cov=notebooklm_mcp

# Run only unit tests
pytest -m unit

# Run integration tests (requires browser)
pytest -m integration
```

📄 License

This project is licensed under the MIT License - see the LICENSE file for details.

🆘 Support


⭐ If this project helps you, please give it a star!
