Gradescope MCP Server
The Gradescope MCP Server enables AI agents and MCP clients to interact with Gradescope for course management, grading, and AI-assisted workflows through 34 tools, 3 resources, and 7 prompts.
Course & Assignment Management
List all courses (grouped by role), assignments (with dates/status), and detailed assignment info
Rename assignments and modify release, due, and late-due dates
Roster, Extensions & Submissions
View full course roster (students, TAs, instructors) with name, email, SID, and submission count
View, add, or update student extensions
Upload files as submissions, list all submissions, and view a specific student's submission
Grading (Read)
Get the hierarchical question/rubric outline (IDs, weights, prompt text)
Export per-question scores with statistics and breakdowns
View grading progress dashboard (completion % per question)
Get full grading context for a submission (rubric, evaluations, score, comments, navigation)
Inspect rubric items without needing a submission ID; list submissions filterable by grade state
Navigate to the next ungraded question submission
Grading (Write)
Apply grades (rubric items, point adjustments, comments)
Create, update, and delete rubric items (with cascade warnings for high-impact changes)
AI-Assisted Workflows
Prepare and cache grading artifacts (prompt, rubric, reference notes, page URLs) to `/tmp/gradescope-mcp`
Assess submission readiness for auto-grading with a confidence score (below 0.6 rejected, 0.6–0.8 warns, above 0.8 proceeds)
Cache crop and neighboring pages for scanned/handwritten exams
Prepare a complete answer key for an entire assignment
Generate a smart tiered reading plan (crop → full page → adjacent pages)
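The confidence gate above can be sketched as a simple threshold check. The thresholds (0.6 and 0.8) come from the behavior described here; the function name `gate_autograde` is illustrative, not part of the server's API:

```python
def gate_autograde(confidence: float) -> str:
    """Classify auto-grading readiness by confidence score.

    Thresholds follow the documented behavior: below 0.6 the
    submission is rejected for auto-grading, 0.6-0.8 proceeds
    with a warning, and above 0.8 proceeds normally.
    """
    if confidence < 0.6:
        return "reject"    # too risky: route to a human grader
    if confidence <= 0.8:
        return "warn"      # proceed, but flag for human review
    return "proceed"       # safe to auto-grade


# gate_autograde(0.45) -> "reject"
# gate_autograde(0.70) -> "warn"
# gate_autograde(0.95) -> "proceed"
```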
Batch Grading via Answer Groups
List AI-clustered answer groups, inspect members and crops, and batch-grade an entire group at once
Regrades & Statistics
List and inspect regrade requests (pending and completed) with student messages and grader responses
Access assignment-level and per-question statistics (mean, median, min/max, std dev)
Safety
All write operations are preview-first, requiring explicit `confirm_write=True` before any mutation is executed.
Click on "Install Server".
Wait a few minutes for the server to deploy. Once ready, it will show a "Started" state.
In the chat, type `@` followed by the MCP server name and your instructions, e.g., "@Gradescope MCP Server show me any pending regrade requests for the midterm"
That's it! The server will respond to your query, and you can continue using it as needed.
Gradescope MCP Server
An MCP (Model Context Protocol) server for Gradescope that exposes course management, grading, regrade review, statistics, and AI-assisted grading workflows to MCP clients.
The server is designed for instructors and TAs who want to use AI agents with real Gradescope data while keeping write operations gated behind explicit confirmation.
This repository also includes a reusable local skill at `skills/gradescope-assisted-grading/SKILL.md` for human-approved grading workflows.
Current Status
34 MCP tools
3 MCP resources
7 MCP prompts
30 automated tests
Python 3.10+
Package manager: `uv`
What The Project Provides
Read-oriented workflows
Course discovery and assignment listing
Assignment outline parsing for online and scanned-PDF assignments
Roster inspection with a custom HTML parser
Submission listing for multiple assignment types
Grading progress, rubric context, answer groups, regrades, and statistics
Workflow helpers that cache grading artifacts and answer-key snapshots to `/tmp/gradescope-mcp`
Write-oriented workflows
Uploading submissions
Setting student extensions
Modifying assignment dates
Renaming assignments
Applying grades
Creating, updating, and deleting rubric items
Batch grading answer groups
All write-capable tools are preview-first and require `confirm_write=True` before any mutation is executed.
Tool Inventory
Core
| Tool | Description | Access |
| --- | --- | --- |
|  | List all courses grouped by role | All |
|  | List assignments for a course | All |
|  | Get one assignment's details | All |
|  | Upload files to an assignment | All |
Instructor / TA Management
| Tool | Description |
| --- | --- |
|  | Full roster grouped by role |
|  | View assignment extensions |
|  | Add or update one student's extension |
|  | Change release / due / late-due dates |
|  | Rename an assignment |
|  | List assignment submissions |
|  | Read one student's submission content |
|  | View graders for a question |
Grading Read
| Tool | Description |
| --- | --- |
|  | Question hierarchy, IDs, weights, prompt text |
|  | Assignment score export and summary |
|  | Per-question grading dashboard |
|  | Full grading context for a question submission |
|  | Rubric inspection without a submission ID |
|  | List Question Submission IDs, filterable by grade state |
|  | Navigate to the next ungraded question submission |
Grading Write
| Tool | Description |
| --- | --- |
|  | Apply rubric items, comments, and point adjustments |
|  | Create a rubric item |
|  | Update a rubric item |
|  | Delete a rubric item |
AI-Assisted / Workflow Helpers
| Tool | Description |
| --- | --- |
|  | Save a question-specific grading artifact to `/tmp/gradescope-mcp` |
|  | Estimate whether auto-grading is safe enough to attempt |
|  | Download crop and nearby pages to `/tmp/gradescope-mcp` |
|  | Save assignment-wide answer-key notes to `/tmp/gradescope-mcp` |
|  | Return a crop-first reading plan |
Answer Groups
| Tool | Description |
| --- | --- |
|  | List AI-clustered answer groups |
|  | Inspect one answer group |
|  | Batch-grade one answer group |
Regrades
| Tool | Description |
| --- | --- |
|  | List regrade requests |
|  | Inspect one regrade request |
Statistics
| Tool | Description |
| --- | --- |
|  | Assignment-level and per-question statistics |
Resources
| URI | Description |
| --- | --- |
|  | Current course list |
|  | Assignment list for a course |
|  | Roster for a course |
Prompts
| Prompt | Description |
| --- | --- |
|  | Summarize assignment status in a course |
|  | Guide extension-management work |
|  | Summarize assignment submission status |
|  | Draft a rubric from assignment structure |
|  | Walk through grading one student's work |
|  | Review pending regrade requests |
|  | Run a confidence-gated grading workflow for one question |
Architecture
Entry points
`src/gradescope_mcp/__main__.py`: loads `.env`, configures logging, runs the FastMCP server
`src/gradescope_mcp/server.py`: registers all tools, resources, and prompts
Authentication
`src/gradescope_mcp/auth.py`: maintains a singleton `GSConnection`
Credentials come from `GRADESCOPE_EMAIL` and `GRADESCOPE_PASSWORD`
`.env` is loaded automatically when starting with `python -m gradescope_mcp`
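A minimal sketch of this credential flow, assuming only that the two documented environment variables are read at startup (`load_credentials` is an illustrative name, not the actual function in `auth.py`):

```python
import os


def load_credentials() -> tuple[str, str]:
    """Read Gradescope credentials from the environment.

    GRADESCOPE_EMAIL / GRADESCOPE_PASSWORD are the variable names
    the server documents; this helper is an illustrative sketch,
    not the server's actual auth module.
    """
    email = os.environ.get("GRADESCOPE_EMAIL")
    password = os.environ.get("GRADESCOPE_PASSWORD")
    if not email or not password:
        raise RuntimeError(
            "Set GRADESCOPE_EMAIL and GRADESCOPE_PASSWORD "
            "(e.g. via a .env file loaded at startup)."
        )
    return email, password
```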
Tool modules
`tools/courses.py`: course listing and roster parsing
`tools/assignments.py`: assignment listing and assignment write operations
`tools/submissions.py`: uploads, submission listing, grader discovery
`tools/extensions.py`: extension reads and writes
`tools/grading.py`: outline parsing, score exports, grading progress
`tools/grading_ops.py`: grading context, writes, rubric CRUD, navigation
`tools/grading_workflow.py`: `/tmp/gradescope-mcp` artifacts, answer keys, readiness, page caching, smart reading
`tools/answer_groups.py`: AI-assisted answer-group inspection and batch writes
`tools/regrades.py`: regrade listing and detail inspection
`tools/statistics.py`: assignment statistics
`tools/safety.py`: preview-first confirmation helpers for mutations
Important Behavior And Constraints
Write safety
Mutating tools return a preview when `confirm_write=False`
The actual change only happens with `confirm_write=True`
Rubric edits and deletions can cascade to existing grades
`tool_grade_answer_group` can affect many submissions at once and needs extra care
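The preview-first gate can be illustrated with a small helper. This is a sketch of the pattern, not the server's `tools/safety.py` implementation; `preview_first` and `WriteResult` are hypothetical names:

```python
from dataclasses import dataclass
from typing import Any, Callable


@dataclass
class WriteResult:
    executed: bool
    detail: Any


def preview_first(action: Callable[[], Any], description: str,
                  confirm_write: bool = False) -> WriteResult:
    """Gate a mutation behind an explicit confirmation flag.

    With confirm_write=False the mutation is only described, never
    run; only confirm_write=True actually executes it.
    """
    if not confirm_write:
        return WriteResult(
            executed=False,
            detail=f"PREVIEW (no changes made): {description}",
        )
    return WriteResult(executed=True, detail=action())
```

A client would first call with the default `confirm_write=False`, show the preview to a human, and only repeat the call with `confirm_write=True` after approval.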
Submission IDs
`tool_get_assignment_submissions` returns assignment-level Global Submission IDs
Grading tools require Question Submission IDs
Use `tool_list_question_submissions`, `tool_get_next_ungraded`, or grading context tools to get the correct IDs
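To illustrate why the distinction matters, here is a toy mapping between the two ID spaces. The IDs and the `resolve_question_submission` helper are hypothetical; only the tool names mentioned in the error message are real:

```python
# Hypothetical data illustrating the two ID spaces. A Global
# Submission ID identifies a student's whole submission; each
# question within it has its own Question Submission ID, and the
# grading tools operate on the latter.
question_submissions = {
    # global_submission_id -> {question_number: question_submission_id}
    111222: {"1": 900001, "2": 900002},
    111223: {"1": 900003, "2": 900004},
}


def resolve_question_submission(global_id: int, question: str) -> int:
    """Map a Global Submission ID + question to the ID grading tools need."""
    try:
        return question_submissions[global_id][question]
    except KeyError:
        raise KeyError(
            f"No question submission for global ID {global_id}, "
            f"question {question}; use tool_list_question_submissions "
            "to enumerate valid IDs."
        ) from None
```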
Scoring direction
Gradescope questions may be positive- or negative-scoring
Rubric weights are stored as positive numbers in both modes
The scoring mode determines whether a checked rubric item adds or deducts points
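A worked sketch of the two scoring modes, assuming only what is stated above (weights stored as positive numbers; the mode decides their sign). `apply_rubric` is illustrative, not the server's scoring code:

```python
def apply_rubric(base_points: float, checked_weights: list[float],
                 mode: str) -> float:
    """Compute a question score from checked rubric items.

    Weights are positive in both modes; the scoring mode decides
    whether a checked item adds points (positive scoring, starting
    from zero) or deducts them (negative scoring, starting from
    full credit).
    """
    total = sum(checked_weights)
    if mode == "positive":
        return total                 # items add points from zero
    if mode == "negative":
        return base_points - total  # items deduct from full credit
    raise ValueError(f"unknown scoring mode: {mode!r}")


# Positive scoring: checked items worth 3 and 2 yield 5 points.
# Negative scoring: a checked 3-point deduction on a 10-point
# question yields 7 points.
```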
Scanned / handwritten assignments
Structured reference answers are often unavailable
This is expected, not necessarily a parsing failure
The workflow helpers are built to use crop regions, full pages, adjacent pages, rubric text, and user-provided reference notes
Quick Start
1. Prerequisites
Python 3.10+
2. Install
```
git clone https://github.com/Yuanpeng-Li/gradescope-mcp.git
cd gradescope-mcp
cp .env.example .env
```

Then edit `.env` with your Gradescope credentials.
3. Run locally
```
uv run python -m gradescope_mcp
```

4. Configure an MCP client
Example client configuration:
```json
{
  "mcpServers": {
    "gradescope": {
      "command": "uv",
      "args": [
        "run",
        "--directory",
        "/path/to/gradescope-mcp",
        "python",
        "-m",
        "gradescope_mcp"
      ],
      "env": {
        "GRADESCOPE_EMAIL": "your_email@example.com",
        "GRADESCOPE_PASSWORD": "your_password"
      }
    }
  }
}
```

5. Debug with MCP Inspector

```
npx @modelcontextprotocol/inspector uv run python -m gradescope_mcp
```

6. Run tests

```
uv run pytest -q
```

Assisted Grading Skill
The repository includes one project-local skill:
gradescope-assisted-grading
It is intended for:
preview-first grading
rubric review before mutation
scanned exam grading
answer-group triage
explicit human approval before any grade write
Install the skill locally
```
mkdir -p /tmp/gradescope-mcp/skills
ln -s "$(pwd)/skills/gradescope-assisted-grading" /tmp/gradescope-mcp/skills/gradescope-assisted-grading
```

If you prefer copying:

```
mkdir -p /tmp/gradescope-mcp/skills
cp -R skills/gradescope-assisted-grading /tmp/gradescope-mcp/skills/
```

Verify installation

```
ls /tmp/gradescope-mcp/skills/gradescope-assisted-grading
cat /tmp/gradescope-mcp/skills/gradescope-assisted-grading/SKILL.md
```

Invoke it from a client with:

```
Use the gradescope-assisted-grading skill
```

or:

```
$gradescope-assisted-grading
```
Project Structure
gradescope-mcp/
├── .env.example
├── AGENT.md
├── DEVLOG.md
├── OPERATIONS_LOGS/
│ └── RECORDS.md
├── README.md
├── pyproject.toml
├── skills/
│ └── gradescope-assisted-grading/
│ └── SKILL.md
├── src/
│ └── gradescope_mcp/
│ ├── __init__.py
│ ├── __main__.py
│ ├── auth.py
│ ├── server.py
│ └── tools/
│ ├── __init__.py
│ ├── answer_groups.py
│ ├── assignments.py
│ ├── courses.py
│ ├── extensions.py
│ ├── grading.py
│ ├── grading_ops.py
│ ├── grading_workflow.py
│ ├── regrades.py
│ ├── safety.py
│ ├── statistics.py
│ └── submissions.py
└── tests/
├── test_answer_groups.py
├── test_assignments_and_grading_ops.py
├── test_extensions_and_answer_key.py
├── test_grading_workflow.py
    └── test_write_safety.py

Development Notes

`AGENT.md` summarizes the current architecture and maintenance expectations
`DEVLOG.md` records the implementation history
`OPERATIONS_LOGS/RECORDS.md` is the mutation log template for real-account testing
Known Caveats
Gradescope behavior differs across assignment types; several tools rely on HTML parsing or reverse-engineered endpoints.
Roster parsing uses a custom parser because the upstream library parser is unreliable when sections are present.
Some assignment types do not support the extensions API even for staff users.
Scanned assignments usually do not provide a structured answer key.
Question grading requires Question Submission IDs, not assignment-level Global Submission IDs.