
get_chapter_summary

Generate chapter summaries from books to understand key content without reading the full text. Input book ID and chapter number to get structured insights.

Instructions

Get the summary for a specific chapter of a book

Input Schema

Name            Required  Description                  Default
book_id         Yes       ID of the book
chapter_number  Yes       Chapter number to summarize
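
For orientation, the sketch below shows how a client might call this tool through the official MCP Python SDK. The launch command and the book ID are hypothetical placeholders; substitute whatever command actually starts the LibraLM server and a real book ID.

    import asyncio

    from mcp import ClientSession, StdioServerParameters
    from mcp.client.stdio import stdio_client


    async def main() -> None:
        # Hypothetical launch command; replace with however you actually start the LibraLM server.
        server = StdioServerParameters(command="libralm-mcp-server")
        async with stdio_client(server) as (read, write):
            async with ClientSession(read, write) as session:
                await session.initialize()
                # Arguments follow the input schema above: book_id is a string,
                # chapter_number an integer. "example-book-id" is a placeholder.
                result = await session.call_tool(
                    "get_chapter_summary",
                    {"book_id": "example-book-id", "chapter_number": 1},
                )
                print(result.content)


    asyncio.run(main())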

Implementation Reference

  • Handler function for the 'get_chapter_summary' tool, registered via the @mcp.tool() decorator. It fetches the summary for a specific chapter with an API request to the LibraLM service.
    @mcp.tool()
    def get_chapter_summary(book_id: str, chapter_number: int) -> str:
        """Get the summary for a specific chapter of a book"""
        try:
            data = _make_api_request(f"/books/{book_id}/chapters/{chapter_number}")
            return data.get("summary", "")
        except Exception as e:
            raise ValueError(
                f"Error getting chapter {chapter_number} summary for book '{book_id}': {str(e)}"
            )
  • Internal helper function that performs authenticated API requests; get_chapter_summary and the other tools rely on it (the configuration helpers it calls are sketched after this list).
    def _make_api_request(endpoint: str) -> dict:
        """Make an authenticated request to the LibraLM API"""
        # Get API key and base URL from request context or environment
        api_key = get_api_key()
        base_url = get_api_base_url()

        headers = {"x-api-key": api_key, "Content-Type": "application/json"}
        url = f"{base_url}{endpoint}"

        response = requests.get(url, headers=headers)

        if response.status_code == 401:
            raise ValueError("Invalid API key. Please check your LibraLM API key.")
        elif response.status_code == 404:
            raise ValueError(f"Resource not found: {endpoint}")
        elif response.status_code != 200:
            raise ValueError(
                f"API request failed with status {response.status_code}: {response.text}"
            )

        # Handle wrapped response format from Lambda
        result = response.json()
        if isinstance(result, dict) and "data" in result:
            return result["data"]
        return result
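
The helper above relies on get_api_key() and get_api_base_url(), and the @mcp.tool() decorator refers to an mcp server object; none of these are shown on this page. Below is a minimal sketch of that scaffolding, assuming the server is built with FastMCP from the MCP Python SDK and configured through environment variables. The server name and the variable names are assumptions, not part of the published implementation.

    import os

    from mcp.server.fastmcp import FastMCP

    # Illustrative server instance; the real name and configuration are not documented here.
    mcp = FastMCP("LibraLM")


    def get_api_key() -> str:
        """Read the LibraLM API key; the LIBRALM_API_KEY variable name is an assumption."""
        api_key = os.environ.get("LIBRALM_API_KEY")
        if not api_key:
            raise ValueError("LIBRALM_API_KEY is not set. Please provide your LibraLM API key.")
        return api_key


    def get_api_base_url() -> str:
        """Read the API base URL; the LIBRALM_API_BASE_URL variable name is an assumption."""
        base_url = os.environ.get("LIBRALM_API_BASE_URL")
        if not base_url:
            raise ValueError("LIBRALM_API_BASE_URL is not set.")
        return base_url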

MCP directory API

We provide all the information about MCP servers via our MCP API.

curl -X GET 'https://glama.ai/api/mcp/v1/servers/libralm-ai/libralm_mcp_server'
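
The same lookup can be done from code; here is a minimal Python equivalent of the curl command above. The response format is not documented on this page, so the sketch simply prints the parsed JSON.

    import requests

    # Fetch this server's entry from the Glama MCP directory API (same URL as the curl example).
    response = requests.get(
        "https://glama.ai/api/mcp/v1/servers/libralm-ai/libralm_mcp_server",
        timeout=30,
    )
    response.raise_for_status()
    print(response.json())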

If you have feedback or need assistance with the MCP directory API, please join our Discord server.