
associate_rubric_with_assignment

Link an existing rubric to a Canvas assignment to establish grading criteria and evaluation standards for student work.

Instructions

Associate an existing rubric with an assignment.

    Args:
        course_identifier: The Canvas course code (e.g., badm_554_120251_246794) or ID
        rubric_id: The ID of the rubric to associate
        assignment_id: The ID of the assignment to associate with
        use_for_grading: Whether to use rubric for grade calculation (default: False)
        purpose: Purpose of the association (grading, bookmark) (default: grading)
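As a sketch, a hypothetical invocation builds the following `rubric_association` payload for Canvas (the IDs below are illustrative, not real Canvas objects):

```python
import json

# Hypothetical arguments; the course code and IDs are illustrative only.
params = {
    "course_identifier": "badm_554_120251_246794",
    "rubric_id": 1234,
    "assignment_id": 5678,
    "use_for_grading": True,
    "purpose": "grading",
}

# The tool wraps these into the payload Canvas expects for a rubric association.
request_data = {
    "rubric_association": {
        "association_id": str(params["assignment_id"]),
        "association_type": "Assignment",
        "use_for_grading": params["use_for_grading"],
        "purpose": params["purpose"],
    }
}

print(json.dumps(request_data, indent=2))
```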
    

Input Schema

Name              | Required | Description | Default
------------------|----------|-------------|--------
course_identifier | Yes      |             |
rubric_id         | Yes      |             |
assignment_id     | Yes      |             |
use_for_grading   | No       |             |
purpose           | No       |             | grading

Output Schema

Name   | Required | Description | Default
-------|----------|-------------|--------
result | Yes      |             |
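For illustration, the result field is a formatted confirmation string along these lines (all values below are hypothetical, mirroring the format used in the handler):

```python
# Hypothetical values; the handler fills these in from Canvas API responses.
course_display = "badm_554_120251_246794"
assignment_name = "Essay 1"
assignment_id = 5678
rubric_id = 1234
use_for_grading = True
purpose = "grading"

result = "Rubric associated with assignment successfully!\n\n"
result += f"Course: {course_display}\n"
result += f"Assignment: {assignment_name} (ID: {assignment_id})\n"
result += f"Rubric ID: {rubric_id}\n"
result += f"Used for Grading: {'Yes' if use_for_grading else 'No'}\n"
result += f"Purpose: {purpose}\n"

print(result)
```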

Implementation Reference

  • The core handler function that implements the associate_rubric_with_assignment tool. It performs a PUT request to the Canvas API endpoint /courses/{course_id}/rubrics/{rubric_id} to associate the rubric with the specified assignment, handling parameters like use_for_grading and purpose, and returns a formatted success message with details.
    @mcp.tool()
    @validate_params
    async def associate_rubric_with_assignment(course_identifier: str | int,
                                             rubric_id: str | int,
                                             assignment_id: str | int,
                                             use_for_grading: bool = False,
                                             purpose: str = "grading") -> str:
        """Associate an existing rubric with an assignment.
    
        Args:
            course_identifier: The Canvas course code (e.g., badm_554_120251_246794) or ID
            rubric_id: The ID of the rubric to associate
            assignment_id: The ID of the assignment to associate with
            use_for_grading: Whether to use rubric for grade calculation (default: False)
            purpose: Purpose of the association (grading, bookmark) (default: grading)
        """
        course_id = await get_course_id(course_identifier)
        rubric_id_str = str(rubric_id)
        assignment_id_str = str(assignment_id)
    
        # Update the rubric with association
        request_data = {
            "rubric_association": {
                "association_id": assignment_id_str,
                "association_type": "Assignment",
                "use_for_grading": use_for_grading,
                "purpose": purpose
            }
        }
    
        # Make the API request
        response = await make_canvas_request(
            "put",
            f"/courses/{course_id}/rubrics/{rubric_id_str}",
            data=request_data
        )
    
        if "error" in response:
            return f"Error associating rubric with assignment: {response['error']}"
    
        # Get assignment details for confirmation
        assignment_response = await make_canvas_request(
            "get",
            f"/courses/{course_id}/assignments/{assignment_id_str}"
        )
    
        assignment_name = "Unknown Assignment"
        if "error" not in assignment_response:
            assignment_name = assignment_response.get("name", "Unknown Assignment")
    
        course_display = await get_course_code(course_id) or course_identifier
    
        result = "Rubric associated with assignment successfully!\n\n"
        result += f"Course: {course_display}\n"
        result += f"Assignment: {assignment_name} (ID: {assignment_id})\n"
        result += f"Rubric ID: {rubric_id}\n"
        result += f"Used for Grading: {'Yes' if use_for_grading else 'No'}\n"
        result += f"Purpose: {purpose}\n"
    
        return result
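The underlying REST call can be sketched with the standard library alone. The base URL and helper name below are assumptions (make_canvas_request itself is not shown on this page), and auth headers are omitted; the builder only constructs the PUT request, leaving sending to the caller:

```python
import json
from urllib import request

def build_rubric_association_request(base_url: str, course_id: str,
                                     rubric_id: str, assignment_id: str,
                                     use_for_grading: bool = False,
                                     purpose: str = "grading") -> request.Request:
    """Construct the PUT request Canvas expects for a rubric association.

    Hypothetical helper for illustration; real code would also attach an
    Authorization: Bearer header with a Canvas API token.
    """
    payload = {
        "rubric_association": {
            "association_id": assignment_id,
            "association_type": "Assignment",
            "use_for_grading": use_for_grading,
            "purpose": purpose,
        }
    }
    # Canvas exposes rubrics under /api/v1/courses/{course_id}/rubrics/{rubric_id}.
    url = f"{base_url}/api/v1/courses/{course_id}/rubrics/{rubric_id}"
    return request.Request(
        url,
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="PUT",
    )

req = build_rubric_association_request(
    "https://canvas.example.edu", "246794", "1234", "5678", use_for_grading=True
)
print(req.get_method(), req.full_url)
```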
  • The call to register_rubric_tools(mcp) in the register_all_tools function, which triggers the registration of the associate_rubric_with_assignment tool among others in the rubrics module.
    register_rubric_tools(mcp)
  • Import of register_rubric_tools from rubrics.py, enabling its use in the top-level tools registration.
    from .rubrics import register_rubric_tools
Behavior: 2/5

Does the description disclose side effects, auth requirements, rate limits, or destructive behavior?

With no annotations provided, the description carries the full burden but states only the action ('associate') without disclosing behavioral traits like permission requirements, side effects, or response format. It mentions default values for parameters but doesn't explain what the association entails operationally.

Agents need to know what a tool does to the world before calling it. Descriptions should go beyond structured annotations to explain consequences.

Conciseness: 4/5

Is the description appropriately sized, front-loaded, and free of redundancy?

The description is front-loaded with the core purpose, followed by a structured Args section. It's efficient with minimal waste, though the Args formatting could be slightly more integrated into natural language for optimal flow.

Shorter descriptions cost fewer tokens and are easier for agents to parse. Every sentence should earn its place.

Completeness: 3/5

Given the tool's complexity, does the description cover enough for an agent to succeed on first attempt?

With no annotations, five parameters at 0% schema description coverage, and an output schema present, the description covers the parameters well but lacks behavioral context. It's adequate for basic use but incomplete for a mutation tool without safety or operational details.

Complex tools with many parameters or behaviors need more documentation. Simple tools need less. This dimension scales expectations accordingly.

Parameters: 4/5

Does the description clarify parameter syntax, constraints, interactions, or defaults beyond what the schema provides?

Schema description coverage is 0%, so the description must compensate. It lists all 5 parameters with brief explanations (e.g., 'Canvas course code or ID', 'ID of the rubric', default values), adding meaningful context beyond the bare schema. However, it doesn't detail format constraints or examples for IDs, keeping it from a 5.

Input schemas describe structure but not intent. Descriptions should explain non-obvious parameter relationships and valid value ranges.

Purpose: 4/5

Does the description clearly state what the tool does and how it differs from similar tools?

The description clearly states the verb ('associate') and resources ('existing rubric' with 'assignment'), making the purpose evident. However, it doesn't explicitly differentiate this tool from siblings like 'create_rubric' or 'update_rubric', which a score of 5 would require.

Agents choose between tools based on descriptions. A clear purpose with a specific verb and resource helps agents select the right tool.

Usage Guidelines: 2/5

Does the description explain when to use this tool, when not to, or what alternatives exist?

The description provides no guidance on when to use this tool versus alternatives like 'create_rubric' or 'update_rubric', nor does it mention prerequisites such as existing rubric/assignment IDs. It lacks explicit when/when-not instructions or named alternatives.

Agents often have multiple tools that could apply. Explicit usage guidance like "use X instead of Y when Z" prevents misuse.
