
PowerPoint MCP Server

by Ichigo3766

add-slide-comparison

Create a comparison slide in PowerPoint presentations to contrast two concepts with structured titles and content. Define left and right side details to highlight differences effectively.

Instructions

Add a new comparison slide with a title and comparison content. Use when you wish to compare two concepts.

Input Schema

  • left_side_content (required): Content/body text of the left concept. Separate main points with a single carriage return character. Make sub-points with a tab character. Do not use bullet points, asterisks, or dashes for points. Maximum of four main points.
  • left_side_title (required): Title of the left concept.
  • presentation_name (required): Name of the presentation to add the slide to.
  • right_side_content (required): Content/body text of the right concept. Separate main points with a single carriage return character. Make sub-points with a tab character. Do not use bullet points, asterisks, or dashes for points. Maximum of four main points.
  • right_side_title (required): Title of the right concept.
  • title (required): Title of the slide.
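The content-formatting rules above can be illustrated with a hypothetical set of tool-call arguments (the presentation name and slide text are invented for illustration): main points are separated by carriage returns, sub-points are prefixed with a tab, and no bullet characters appear.

```python
# Hypothetical arguments for an add-slide-comparison call.
# Main points are separated by carriage returns ("\r");
# sub-points are prefixed with a tab ("\t"); no bullet characters are used.
arguments = {
    "presentation_name": "architecture_review",
    "title": "SQL vs NoSQL",
    "left_side_title": "SQL",
    "left_side_content": "Fixed schema\r\tTables and rows\rACID transactions",
    "right_side_title": "NoSQL",
    "right_side_content": "Flexible schema\r\tDocuments or key-value\rEventual consistency",
}

# Lines not starting with a tab are main points; each side must stay
# within the four-main-point limit.
main_points = [p for p in arguments["left_side_content"].split("\r")
               if not p.startswith("\t")]
```

Here the left side has two main points ("Fixed schema" and "ACID transactions") and one sub-point, comfortably within the limit.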

Implementation Reference

  • Core handler function that adds a comparison slide to the presentation using layout 4 (SLIDE_LAYOUT_COMPARISON), setting title and content in left/right placeholders.
    def add_comparison_slide(self, presentation_name: str, title: str, left_side_title: str, left_side_content: str,
                             right_side_title: str, right_side_content: str):
        """
        Create a comparison slide in the given presentation.
    
        Args:
            presentation_name: The presentation to add the slide to
            title: The title of the slide
            left_side_title: The title of the left hand side content
            left_side_content: The body content for the left hand side
            right_side_title: The title of the right hand side content
            right_side_content: The body content for the right hand side
        """
        try:
            prs = self.presentations[presentation_name]
        except KeyError as e:
            raise ValueError(f"Presentation '{presentation_name}' not found") from e
    
        # Add a new slide with layout
        slide_layout = prs.slide_layouts[self.SLIDE_LAYOUT_COMPARISON]
        slide = prs.slides.add_slide(slide_layout)
    
        # Set the title
        title_shape = slide.shapes.title
        title_shape.text = title
    
        # Build the left hand content
        content_shape = slide.placeholders[1]
        text_frame = content_shape.text_frame
        text_frame.text = left_side_title
    
        content_shape = slide.placeholders[2]
        text_frame = content_shape.text_frame
        text_frame.text = left_side_content
    
        # Build the right hand content
        content_shape = slide.placeholders[3]
        text_frame = content_shape.text_frame
        text_frame.text = right_side_title
    
        content_shape = slide.placeholders[4]
        text_frame = content_shape.text_frame
        text_frame.text = right_side_content
        return slide
  • MCP server tool call handler for 'add-slide-comparison': validates arguments, retrieves presentation, calls PresentationManager.add_comparison_slide, and returns success message.
    elif name == "add-slide-comparison":
        # Get arguments
        presentation_name = arguments["presentation_name"]
        title = arguments["title"]
        left_side_title = arguments["left_side_title"]
        left_side_content = arguments["left_side_content"]
        right_side_title = arguments["right_side_title"]
        right_side_content = arguments["right_side_content"]
    
        if not all([presentation_name, title, left_side_title, left_side_content,
                    right_side_title, right_side_content]):
            raise ValueError("Missing required arguments")
    
        if presentation_name not in presentation_manager.presentations:
            raise ValueError(f"Presentation not found: {presentation_name}")
        try:
            slide = presentation_manager.add_comparison_slide(presentation_name, title, left_side_title,
                                                              left_side_content, right_side_title, right_side_content)
        except Exception as e:
            raise ValueError(f"Unable to add comparison slide to {presentation_name}.pptx") from e
    
        return [types.TextContent(
            type="text",
            text=f"Successfully added comparison slide {title} to {presentation_name}.pptx"
        )]
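The handler's all-or-nothing argument check can be sketched as a standalone helper (`validate_arguments` is a hypothetical name introduced here, not part of the server):

```python
# Hypothetical helper mirroring the handler's validation: every argument
# must be present and non-empty before the slide is created.
REQUIRED_ARGS = ("presentation_name", "title", "left_side_title", "left_side_content",
                 "right_side_title", "right_side_content")

def validate_arguments(arguments: dict) -> None:
    # arguments.get(name) is falsy for both absent keys and empty strings,
    # matching the handler's `if not all([...])` check.
    missing = [name for name in REQUIRED_ARGS if not arguments.get(name)]
    if missing:
        raise ValueError(f"Missing required arguments: {', '.join(missing)}")
```

A benefit of naming the missing keys, rather than raising a bare "Missing required arguments", is that a calling agent can repair its tool call without guessing.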
  • Tool registration in list_tools(): defines name, description, and detailed inputSchema for 'add-slide-comparison' tool.
    types.Tool(
        name="add-slide-comparison",
        description="Add a new comparison slide with a title and comparison content. Use when you wish to "
                    "compare two concepts",
        inputSchema={
            "type": "object",
            "properties": {
                "presentation_name": {
                    "type": "string",
                    "description": "Name of the presentation to add the slide to",
                },
                "title": {
                    "type": "string",
                    "description": "Title of the slide",
                },
                "left_side_title": {
                    "type": "string",
                    "description": "Title of the left concept",
                },
                "left_side_content": {
                    "type": "string",
                    "description": "Content/body text of left concept. "
                                   "Separate main points with a single carriage return character. "
                                   "Make sub-points with a tab character. "
                                   "Do not use bullet points, asterisks or dashes for points. "
                                   "Max main points is 4"
                },
                "right_side_title": {
                    "type": "string",
                    "description": "Title of the right concept",
                },
                "right_side_content": {
                    "type": "string",
                    "description": "Content/body text of right concept. "
                                   "Separate main points with a single carriage return character. "
                                   "Make sub-points with a tab character. "
                                   "Do not use bullet points, asterisks or dashes for points. "
                                   "Max main points is 4"
                },
            },
            "required": ["presentation_name", "title", "left_side_title", "left_side_content",
                         "right_side_title", "right_side_content"],
        },
    ),
  • Layout constant used for comparison slides (SLIDE_LAYOUT_COMPARISON = 4).
    SLIDE_LAYOUT_COMPARISON = 4
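The constant selects index 4 of the presentation's slide-layout collection. In python-pptx's default template the layout order is commonly the following, but custom templates can reorder or omit layouts, so treat these names as an assumption to verify against the template in use:

```python
# Slide-layout indices as they appear in python-pptx's default template
# (assumed; custom templates may differ).
DEFAULT_TEMPLATE_LAYOUTS = {
    0: "Title Slide",
    1: "Title and Content",
    2: "Section Header",
    3: "Two Content",
    4: "Comparison",          # SLIDE_LAYOUT_COMPARISON
    5: "Title Only",
    6: "Blank",
    7: "Content with Caption",
    8: "Picture with Caption",
}

SLIDE_LAYOUT_COMPARISON = 4
```

If a custom template is loaded, iterating `prs.slide_layouts` and checking each layout's `name` attribute is a safer way to locate the comparison layout than hard-coding the index.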
Behavior: 2/5

Does the description disclose side effects, auth requirements, rate limits, or destructive behavior?

No annotations are provided, so the description carries the full burden of behavioral disclosure. The description states 'Add a new comparison slide,' implying a write/mutation operation, but doesn't disclose critical behavioral traits such as whether this requires specific permissions, if it modifies existing presentations, error handling, or what happens on success/failure. For a mutation tool with zero annotation coverage, this is a significant gap.

Agents need to know what a tool does to the world before calling it. Descriptions should go beyond structured annotations to explain consequences.

Conciseness: 5/5

Is the description appropriately sized, front-loaded, and free of redundancy?

The description is extremely concise with two sentences that directly state the purpose and usage guidelines. Every word earns its place, and it's front-loaded with the core functionality. There's no wasted verbiage or redundancy.

Shorter descriptions cost fewer tokens and are easier for agents to parse. Every sentence should earn its place.

Completeness: 3/5

Given the tool's complexity, does the description cover enough for an agent to succeed on first attempt?

Given the complexity of a 6-parameter mutation tool with no annotations and no output schema, the description is incomplete. It adequately covers purpose and basic usage but lacks behavioral details (e.g., side effects, permissions) and output information. The schema provides parameter documentation, but the description doesn't compensate for the missing behavioral context, making it minimally viable but with clear gaps.

Complex tools with many parameters or behaviors need more documentation. Simple tools need less. This dimension scales expectations accordingly.

Parameters: 3/5

Does the description clarify parameter syntax, constraints, interactions, or defaults beyond what the schema provides?

Schema description coverage is 100%, so the schema already documents all 6 parameters thoroughly. The description mentions 'title and comparison content,' which aligns with parameters like 'title,' 'left_side_title,' and 'right_side_title,' but doesn't add meaningful semantics beyond what the schema provides. Baseline 3 is appropriate when the schema does the heavy lifting.

Input schemas describe structure but not intent. Descriptions should explain non-obvious parameter relationships and valid value ranges.

Purpose: 4/5

Does the description clearly state what the tool does and how it differs from similar tools?

The description clearly states the verb 'Add' and the resource 'comparison slide' with specific content elements (title and comparison content). It distinguishes the tool's purpose from siblings by focusing on comparison slides rather than other slide types like pictures, tables, or charts. However, it doesn't explicitly differentiate from all sibling tools (e.g., 'add-slide-title-content' could be similar).

Agents choose between tools based on descriptions. A clear purpose with a specific verb and resource helps agents select the right tool.

Usage Guidelines: 4/5

Does the description explain when to use this tool, when not to, or what alternatives exist?

The description provides explicit usage guidance with 'Use when you wish to compare two concepts,' which clearly indicates the appropriate context. It doesn't specify when NOT to use this tool or name alternative tools, but the context is sufficiently clear for an agent to understand its purpose relative to the sibling tools listed.

Agents often have multiple tools that could apply. Explicit usage guidance like "use X instead of Y when Z" prevents misuse.
