notebooklm-mcp-pro
Production-grade Model Context Protocol server for Google NotebookLM
Connect any MCP-capable client to Google NotebookLM. Works with Claude Desktop, Claude.ai, ChatGPT, Cursor, VS Code Continue, and any client that speaks MCP or OpenAPI.
Documentation · Quick Start · Tools · Integrations
✨ Features

- One Python package.
- One CLI.
- One server factory.
- Local stdio transport.
- Remote Streamable HTTP transport.
- Bearer token authentication.
- GitHub OAuth authentication.
- ChatGPT Custom Actions through OpenAPI 3.1.
- Plugin manifest at `/.well-known/ai-plugin.json`.
- OAuth metadata endpoints.
- Notebook tools.
- Source ingestion tools.
- Chat tools.
- Research tools.
- Artifact generation tools.
- Artifact lifecycle tools.
- Language tools.
- Admin tools.
- ChatGPT-compatible `search`.
- ChatGPT-compatible `fetch`.
- Typed Pydantic inputs.
- Typed output models where the MCP surface needs stable shapes.
- Tool safety annotations.
- Confirmation checks for destructive operations.
- SQLite task tracking.
- SQLite OAuth sessions.
- Structured logging through structlog.
- Offline unit tests.
- Subprocess stdio integration test.
- HTTP auth tests.
- OpenAPI tests.
- Coverage gate.
- Ruff formatting.
- Strict mypy.
- MkDocs Material documentation.
- Docker image.
- Docker Compose template.
- Railway template.
- Fly.io template.
- Kubernetes manifests.
- Release workflow with wheel, sdist, SBOM, PyPI, and GHCR publishing.
Why this exists

Google NotebookLM is useful for research notebooks, source-grounded chat, study material, and artifact generation, but MCP clients need a stable programmatic bridge to reach it. notebooklm-mcp-pro provides that bridge: it exposes NotebookLM actions as MCP tools, NotebookLM records as MCP resources, and workflow starters as MCP prompts. It also exposes an OpenAPI action surface for clients that integrate through HTTP schemas.
📦 Installation

uv

```bash
uv tool install notebooklm-mcp-pro
nlm-mcp --version
```

pip

```bash
python -m pip install --upgrade notebooklm-mcp-pro
nlm-mcp --version
```

pipx

```bash
pipx install notebooklm-mcp-pro
nlm-mcp --version
```

Full optional install

```bash
python -m pip install "notebooklm-mcp-pro[all]"
```

From source

```bash
git clone https://github.com/oaslananka/notebooklm-mcp-pro
cd notebooklm-mcp-pro
make bootstrap
make test
```

NotebookLM login

Run the NotebookLM browser login once:

```bash
notebooklm-py login
```

The default auth file is `~/.config/nlm-mcp/notebooklm_auth.json`. Override it with:

```bash
export NLM_MCP_NOTEBOOKLM_AUTH_FILE=/secure/path/notebooklm_auth.json
```

For containers:

```bash
export NLM_MCP_NOTEBOOKLM_AUTH_JSON='{"cookies":[],"origins":[]}'
```

Treat this JSON as a secret.
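For container deployments, the inline JSON can be produced from an existing auth file. A minimal sketch of that step; the helper name and the shape check are assumptions based only on the example payload above:

```python
import json
import pathlib
import tempfile

def auth_file_to_env_value(path: str) -> str:
    """Read a NotebookLM auth file and return compact JSON suitable
    for the NLM_MCP_NOTEBOOKLM_AUTH_JSON environment variable."""
    data = json.loads(pathlib.Path(path).read_text())
    # Shape check based on the documented example payload.
    for key in ("cookies", "origins"):
        if key not in data:
            raise ValueError(f"auth file missing {key!r}")
    return json.dumps(data, separators=(",", ":"))

# Demonstrate with a temporary stand-in for the real auth file.
with tempfile.NamedTemporaryFile("w", suffix=".json", delete=False) as f:
    json.dump({"cookies": [], "origins": []}, f)
value = auth_file_to_env_value(f.name)
print(value)  # {"cookies":[],"origins":[]}
```

Piping the output into the environment variable keeps the secret out of shell history if the file itself is already protected.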
🚀 Quick Start

Local stdio

```bash
pip install notebooklm-mcp-pro
notebooklm-py login
nlm-mcp stdio
```

Use this mode for local desktop clients. It does not add an HTTP auth layer; the caller process controls access.
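Under stdio transport, the desktop client launches `nlm-mcp stdio` as a child process and exchanges newline-delimited JSON-RPC messages over its stdin/stdout, starting with an `initialize` request. A sketch of that message's shape; all field values here are illustrative placeholders, not taken from this package:

```python
import json

# Illustrative MCP initialize request, as a client might write it to the
# server's stdin. Field values are example placeholders.
init = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "initialize",
    "params": {
        "protocolVersion": "2025-03-26",
        "capabilities": {},
        "clientInfo": {"name": "example-client", "version": "0.1"},
    },
}
line = json.dumps(init) + "\n"  # one message per line on the wire
print(line.strip())
```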
Remote HTTP with bearer token

```bash
export NLM_MCP_TRANSPORT=http
export NLM_MCP_AUTH_MODE=token
export NLM_MCP_BEARER_TOKEN="$(python -c 'import secrets; print(secrets.token_urlsafe(32))')"
export NLM_MCP_BASE_URL=https://your-server.example.com
nlm-mcp serve --host 0.0.0.0 --port 8080
```

Test:

```bash
curl https://your-server.example.com/healthz
curl -H "Authorization: Bearer $NLM_MCP_BEARER_TOKEN" \
  https://your-server.example.com/mcp
```

Remote HTTP with GitHub OAuth

```bash
export NLM_MCP_TRANSPORT=http
export NLM_MCP_AUTH_MODE=github-oauth
export NLM_MCP_BASE_URL=https://your-server.example.com
export NLM_MCP_GITHUB_CLIENT_ID=your-client-id
export NLM_MCP_GITHUB_CLIENT_SECRET=your-client-secret
export NLM_MCP_OAUTH_ALLOWED_USERS=oaslananka
nlm-mcp serve --host 0.0.0.0 --port 8080
```

Users start at `https://your-server.example.com/auth/login`.

🔌 Integrations
Claude Desktop
Add this to the desktop config file.

macOS: `~/Library/Application Support/Claude/claude_desktop_config.json`
Windows: `%APPDATA%\Claude\claude_desktop_config.json`
Linux: `~/.config/Claude/claude_desktop_config.json`

Config:

```json
{
  "mcpServers": {
    "notebooklm": {
      "command": "nlm-mcp",
      "args": ["stdio"],
      "env": {
        "NLM_MCP_LOG_LEVEL": "WARNING"
      }
    }
  }
}
```

With uvx:

```json
{
  "mcpServers": {
    "notebooklm": {
      "command": "uvx",
      "args": ["notebooklm-mcp-pro", "stdio"]
    }
  }
}
```

Claude.ai Web

Deploy the HTTP server with a public HTTPS URL. Use `https://your-server.example.com/mcp`. Choose bearer token or OAuth based on server configuration. Run `admin.health` to verify.

ChatGPT Custom Actions

Deploy the HTTP server. Import `https://your-server.example.com/openapi.json`. Set authentication to bearer token when `NLM_MCP_AUTH_MODE=token`.

The action endpoints are `POST /tools/{tool_name}`. The manifest is `GET /.well-known/ai-plugin.json`.

Cursor

Use the same local stdio config shape:

```json
{
  "mcpServers": {
    "notebooklm": {
      "command": "nlm-mcp",
      "args": ["stdio"]
    }
  }
}
```

VS Code Continue

Use local stdio or remote HTTP depending on your Continue configuration.

Local command: `nlm-mcp stdio`
Remote endpoint: `https://your-server.example.com/mcp`

🛠 Tools
Notebook tools
| Tool | Purpose | Safety |
| --- | --- | --- |
| | List notebooks | read-only |
| | Create a notebook | mutating |
| | Get notebook metadata | read-only |
| | Rename a notebook | idempotent |
| | Delete a notebook | destructive, confirmation required |
| | Toggle public sharing | destructive, confirmation required when enabling |
| | Invite collaborator | mutating, confirmation required |
| | Read sharing settings | read-only |
Source tools
| Tool | Purpose | Safety |
| --- | --- | --- |
| | Add a web URL | mutating |
| | Add a YouTube video | mutating |
| | Upload a local file | mutating |
| | Add a Google Drive document | mutating |
| | Add pasted text | mutating |
| | List sources | read-only |
| | Get source metadata | read-only |
| | Get indexed text | read-only |
| | Re-index a source | idempotent |
| | Wait for indexing | read-only, blocking |
| | Remove a source | destructive, confirmation required |
Chat tools
| Tool | Purpose |
| --- | --- |
| | Ask a one-shot question |
| | OpenAPI alias for asking |
| | Stream-oriented alias returning a completed result |
| | Start or identify a conversation |
| | Continue a conversation |
| | Read conversation history |
| | Save content as a note |
| | Alias for note save |
| | List notes |
Research tools
| Tool | Purpose |
| --- | --- |
| | Start web research |
| | Start Drive research |
| | Poll research status |
| | Wait for research and optionally import sources |
Generation tools
| Tool | Output |
| --- | --- |
| | Audio overview |
| | Video overview |
| | Cinematic video |
| | Slide deck |
| | Infographic |
| | Quiz |
| | Flashcards |
| | Report |
| | Data table |
| | Mind map |
Artifact lifecycle tools
| Tool | Purpose |
| --- | --- |
| | List artifacts and tracked tasks |
| | Poll task status |
| | Wait for task completion |
| | Download an artifact |
| | Delete an artifact when supported |
| | Cancel a task when supported |
| | Revise one slide |
Language tools
| Tool | Purpose |
| --- | --- |
| | List supported languages |
| | Read current output language |
| | Set account-global output language |
Compatibility tools
| Tool | Purpose |
| --- | --- |
| `search` | Return matching record IDs |
| `fetch` | Return full record by ID |
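The `search`/`fetch` pair follows the ChatGPT connector convention: `search` returns lightweight hits and `fetch` returns the full record for one hit's ID. The field names below are the conventional ones and are an assumption here, not taken from this package's schemas:

```python
# Assumed response shapes for the ChatGPT-compatible search/fetch pair.
search_result = {
    "results": [
        {"id": "notebook-123", "title": "My research notebook"},
    ]
}
fetch_result = {
    "id": "notebook-123",
    "title": "My research notebook",
    "text": "Full indexed content of the record.",
}
# A client searches first, then fetches each matching ID.
ids = [hit["id"] for hit in search_result["results"]]
print(ids)  # ['notebook-123']
```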
Admin tools
| Tool | Purpose |
| --- | --- |
| `admin.health` | Server health |
| | Package and runtime version |
⚙️ Configuration

| Variable | Default | Description |
| --- | --- | --- |
| | | HTTP bind host |
| | | HTTP bind port |
| | | MCP endpoint path |
| `NLM_MCP_BASE_URL` | unset | Public URL |
| `NLM_MCP_BEARER_TOKEN` | unset | Token auth secret |
| `NLM_MCP_GITHUB_CLIENT_ID` | unset | OAuth client ID |
| `NLM_MCP_GITHUB_CLIENT_SECRET` | unset | OAuth client secret |
| `NLM_MCP_OAUTH_ALLOWED_USERS` | unset | GitHub username allowlist |
| `NLM_MCP_NOTEBOOKLM_AUTH_FILE` | `~/.config/nlm-mcp/notebooklm_auth.json` | NotebookLM auth file |
| `NLM_MCP_NOTEBOOKLM_AUTH_JSON` | unset | Inline NotebookLM auth JSON |
| | | Runtime data directory |
| `NLM_MCP_LOG_LEVEL` | | Log level |
See Configuration for the full table.
🐳 Docker

Build

```bash
docker build -f deploy/Dockerfile -t notebooklm-mcp-pro:dev .
```

Run

```bash
docker run --rm -p 8080:8080 \
  -e NLM_MCP_TRANSPORT=http \
  -e NLM_MCP_AUTH_MODE=token \
  -e NLM_MCP_BEARER_TOKEN=replace-with-generated-token \
  -e NLM_MCP_BASE_URL=http://localhost:8080 \
  notebooklm-mcp-pro:dev
```

Compose

```bash
docker compose -f deploy/docker-compose.yml up --build
```

Pull

```bash
docker pull ghcr.io/oaslananka/notebooklm-mcp-pro:latest
```

HTTP Endpoints
| Endpoint | Purpose | Auth |
| --- | --- | --- |
| `/healthz` | health check | exempt |
| `/openapi.json` | OpenAPI schema | exempt |
| `/.well-known/ai-plugin.json` | plugin manifest | exempt |
| | OAuth resource metadata | exempt |
| | OAuth server metadata | exempt |
| `/auth/login` | GitHub OAuth login | exempt |
| | GitHub OAuth callback | exempt |
| `POST /tools/{tool_name}` | OpenAPI tool action | authenticated |
| `/mcp` | Streamable HTTP MCP endpoint | authenticated |
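The authenticated endpoints expect a bearer `Authorization` header when token auth is enabled. A sketch of building a request to the tool action endpoint with Python's standard library; the base URL, token value, and use of the `admin.health` tool name in the path are illustrative assumptions:

```python
import json
import urllib.request

def tool_request(base_url: str, tool_name: str, token: str, payload: dict) -> urllib.request.Request:
    """Build a POST request for the /tools/{tool_name} action endpoint."""
    req = urllib.request.Request(
        f"{base_url.rstrip('/')}/tools/{tool_name}",
        data=json.dumps(payload).encode(),
        method="POST",
    )
    req.add_header("Authorization", f"Bearer {token}")
    req.add_header("Content-Type", "application/json")
    return req

req = tool_request("https://your-server.example.com", "admin.health", "example-token", {})
print(req.full_url)                     # https://your-server.example.com/tools/admin.health
print(req.get_header("Authorization"))  # Bearer example-token
```

Sending the request (for example with `urllib.request.urlopen(req)`) requires a deployed server; the builder above only shows the expected shape.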
Architecture

```mermaid
flowchart TB
    Desktop["Desktop MCP client"] --> Stdio["stdio transport"]
    Remote["Remote MCP/OpenAPI client"] --> HTTP["Streamable HTTP"]
    HTTP --> Auth["Auth middleware"]
    Stdio --> Server["FastMCP server"]
    Auth --> Server
    Server --> NotebookTools["Notebook tools"]
    Server --> SourceTools["Source tools"]
    Server --> ArtifactTools["Artifact tools"]
    Server --> Resources["MCP resources"]
    Server --> Prompts["MCP prompts"]
    NotebookTools --> Backend["NotebookLMBackend"]
    SourceTools --> Backend
    ArtifactTools --> Backend
    Backend --> NLM["notebooklm-py"]
    NLM --> Google["Google NotebookLM"]
    Server --> Store["SQLite task and OAuth store"]
```

🔒 Security
- Do not expose HTTP mode publicly without auth.
- Use bearer tokens for personal deployments.
- Use GitHub OAuth for multi-user deployments.
- Store NotebookLM auth JSON in a secret manager.
- Mount auth files read-only in containers.
- Keep `NLM_MCP_BASE_URL` on HTTPS for OAuth.
- Artifact downloads are constrained to the artifacts directory.
- Destructive tools require explicit confirmation.
- CI runs lint, typecheck, tests, dependency audit, static analysis, and secret scanning.

See Security.
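For bearer-token deployments, tokens should be high-entropy and compared in constant time. A general-purpose sketch of both steps, not this package's internal auth code:

```python
import secrets

def new_token() -> str:
    # 32 random bytes, URL-safe; same recipe as the Quick Start one-liner.
    return secrets.token_urlsafe(32)

def check_token(presented: str, expected: str) -> bool:
    # Constant-time comparison avoids leaking prefix matches via timing.
    return secrets.compare_digest(presented, expected)

tok = new_token()
print(check_token(tok, tok))      # True
print(check_token("wrong", tok))  # False
```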
Development
```bash
make bootstrap
make lint
make typecheck
make test
make docs
```

Generate the catalog:

```bash
make catalog
```

Run the HTTP server:

```bash
make run-http
```

Run the stdio server:

```bash
make run-stdio
```

Release
Releases are cut from tags:

```bash
git tag v1.0.0
git push origin v1.0.0
```

The release workflow validates the tag, builds distributions, generates an SBOM, publishes to PyPI, pushes GHCR images, and creates a GitHub release.
Roadmap
Planned follow-up work:

- Additional OAuth providers.
- Shared Redis-backed rate limiting.
- Hosted UI widgets for richer artifact previews.
- More recorded NotebookLM fixtures.
- More deployment blueprints.
See docs/ROADMAP.md.
🤝 Contributing
Contributions are welcome when they are scoped, tested, and documented.
Read CONTRIBUTING.md.
Before opening a PR:

```bash
make lint
make typecheck
make test
make docs
```

Use Conventional Commits.
📄 License
MIT License.
See LICENSE.