
ask_question

Ask questions to NotebookLM for research and answers based on your documents, with contextual conversations and source citation options.

Instructions

Conversational Research Partner (NotebookLM • Gemini 2.5 • Session RAG)

No Active Notebook

  • Visit https://notebooklm.google to create a notebook and get a share link

  • Use add_notebook to add it to your library (the tool explains how to get the link)

  • Use list_notebooks to show available sources

  • Use select_notebook to set one as active (a call sketch follows below)

Auth tip: If login is required, use the prompt 'notebooklm.auth-setup' and then verify with the 'get_health' tool. If authentication later fails (e.g., expired cookies), use the prompt 'notebooklm.auth-repair'.

Tip: Tell the user you can manage the NotebookLM library and ask which notebook to use for the current task.
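The workflow above is just a sequence of MCP tool calls, so any MCP client can drive it programmatically. Below is a minimal sketch using the MCP Python SDK; the launch command and the argument names for add_notebook and select_notebook are illustrative assumptions (check each tool's own schema), since only ask_question's schema is documented on this page.

```python
import asyncio

from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client

# Assumed launch command -- substitute the server's actual entry point.
server = StdioServerParameters(command="notebooklm-mcp", args=[])

async def set_up_notebook() -> None:
    async with stdio_client(server) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()

            # 1. Add a notebook share link to the library
            #    (argument name "url" is an assumption).
            await session.call_tool(
                "add_notebook",
                {"url": "https://notebooklm.google.com/notebook/<share-id>"},
            )

            # 2. List the library, then mark one notebook as active
            #    (argument name "notebook_id" is an assumption).
            print(await session.call_tool("list_notebooks", {}))
            await session.call_tool(
                "select_notebook", {"notebook_id": "<id-from-list>"}
            )

            # 3. Verify authentication before asking questions.
            print(await session.call_tool("get_health", {}))

asyncio.run(set_up_notebook())
```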

Input Schema

  • question (required): The question to ask NotebookLM.

  • session_id (optional): Session ID for contextual conversations. If omitted, a new session is created.

  • notebook_id (optional): Notebook ID from your library. If omitted, the active notebook is used. Use list_notebooks to see available notebooks.

  • notebook_url (optional): Notebook URL; overrides notebook_id. Use this for ad-hoc queries to notebooks not in your library.

  • show_browser (optional): Show the browser window for debugging (simple version). For advanced control (typing speed, stealth, etc.), use browser_options instead.

  • source_format (optional, default: none): Format for source citation extraction. Options:
      • none: no source extraction (fastest)
      • inline: insert the source text inline, e.g. "text [1: source excerpt]"
      • footnotes: append sources at the end as footnotes
      • json: return sources as a separate object in the response
      • expanded: replace [1] with the full quoted source text
    Source extraction adds ~1-2 seconds but does not consume additional NotebookLM quota.

  • browser_options (optional): Browser behavior settings the client (e.g. Claude) can control: visibility, typing speed, stealth mode, timeouts. Useful for debugging or fine-tuning.
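Only question is required, so the smallest useful call passes just that; the optional fields map one-to-one onto the schema above. A hedged, self-contained sketch with the MCP Python SDK follows (the launch command is again a placeholder and the question text is invented):

```python
import asyncio

from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client

# Assumed launch command -- substitute the server's actual entry point.
server = StdioServerParameters(command="notebooklm-mcp", args=[])

async def ask() -> None:
    async with stdio_client(server) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()
            result = await session.call_tool(
                "ask_question",
                {
                    "question": "What evaluation metrics do my sources report?",
                    # Omit session_id on the first turn; reuse the id the server
                    # returns to keep later questions in the same conversation.
                    "source_format": "footnotes",  # none | inline | footnotes | json | expanded
                },
            )
            print(result.content)

asyncio.run(ask())
```

Passing notebook_url instead of notebook_id would target a notebook that is not in your library, as noted in the schema above.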


MCP directory API

We provide all the information about MCP servers via our MCP API.

curl -X GET 'https://glama.ai/api/mcp/v1/servers/roomi-fields/notebooklm-mcp'

If you have feedback or need assistance with the MCP directory API, please join our Discord server.