Practera MCP Server
An MCP (Model Context Protocol) server that provides access to Practera's GraphQL API, allowing AI models to query Practera learning data.
Why Practera MCP?
With this MCP server, you can use LLMs to analyze Practera projects and assessments. For now, this is only available to learning designers (author users).
Here are some examples of how you can use this MCP server:
- Analyze the structure of a project and look for how it can be extended or compressed.
- Restructure the project for different grade levels or different audiences.
- Evaluate the assessments in the project and look for how they can be improved.
- Generate project blueprints and templates.
- Generate assessments and questions.
- Create a common cartridge version of a project, or import projects from other LMS data files.
Roadmap
- [ ] Support metrics API for generating LLM reports
- [ ] Support OAuth 2.1 for secure access
- [ ] Support dynamic creation of assessments, milestones, activities, and tasks
- [ ] Support generation of media assets
- [ ] Dynamic resource/tool/prompt selection based on project context
Features
- Server-Sent Events (SSE) transport for MCP
- AWS Lambda deployment support
- GraphQL integration with Practera API
- Region-specific endpoints
- API key authentication
- OAuth 2.1 support for secure access (coming soon)
Prerequisites
- Node.js 18+
- npm
- AWS account (for deployment)
- Practera API key
- OAuth client credentials (for OAuth authentication)
Installation
- Clone this repository
- Install dependencies:
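The exact command block is not preserved in this copy; assuming the standard npm workflow implied by the prerequisites, installation looks roughly like:

```sh
# Clone the repository (replace the placeholder with this repo's clone URL)
git clone <repository-url>
cd <repository-directory>

# Install Node.js dependencies
npm install
```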
Local Development
- Start the server in development mode (see the command sketch after this list).
- The server will be available at `http://localhost:3000/sse`.
- OAuth endpoints will be accessible at `http://localhost:3000/oauth/*`.
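The development command is not preserved here; assuming a conventional `dev` npm script, starting the server locally would look like:

```sh
# Start the MCP server locally (script name is assumed)
npm run dev
```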
Build
To build the project for deployment:
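The build command is not preserved here either; assuming a conventional `build` npm script:

```sh
# Compile the project for deployment (script name is assumed)
npm run build
```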
Deployment to AWS Lambda
- Make sure you have AWS CLI installed and configured.
- Set up your OAuth configuration parameters.
- Deploy using the Serverless Framework (a command sketch for both steps follows).
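The original command blocks are not preserved in this copy. As a hedged sketch, one common pattern is to store the OAuth settings as AWS SSM parameters and then run the Serverless CLI; the parameter names below are hypothetical and not taken from this repository:

```sh
# Store OAuth configuration (parameter names are hypothetical placeholders)
aws ssm put-parameter --name /practera-mcp/oauth-client-id \
  --value "<oauth-client-id>" --type SecureString
aws ssm put-parameter --name /practera-mcp/oauth-client-secret \
  --value "<oauth-client-secret>" --type SecureString

# Deploy to AWS Lambda with the Serverless Framework
npx serverless deploy
```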
Authentication Methods
API Key Authentication
For simple integration, you can use API key authentication by providing:
- an `apikey` parameter in each tool call
- a `region` parameter to specify the Practera region
OAuth 2.1 Authentication (coming soon)
The server also supports OAuth 2.1 for secure authentication flows:
- Redirect users to `/oauth/authorize` for authorization
- Exchange the authorization code for an access token at `/oauth/token`
- Access the MCP server endpoints using the bearer token
- Revoke tokens if needed at `/oauth/revoke`
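Since OAuth 2.1 support is still coming, the exact request shape is not documented here; as a rough sketch of a standard authorization-code token exchange (all values are placeholders, and PKCE is assumed because OAuth 2.1 requires it):

```sh
# Exchange an authorization code for an access token (standard OAuth 2.1 shape; assumed)
curl -X POST "https://<your-deployment>/oauth/token" \
  -d grant_type=authorization_code \
  -d code="<authorization-code>" \
  -d client_id="<client-id>" \
  -d redirect_uri="<redirect-uri>" \
  -d code_verifier="<pkce-code-verifier>"
```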
Available MCP Tools
This server exposes the following MCP tools:
- `mcp_practera_get_project` - Get details about a Practera project
- `mcp_practera_get_assessment` - Get details about a Practera assessment
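For illustration, a raw MCP `tools/call` request to the project tool could look like the following. The `projectUuid` argument name is a hypothetical placeholder (the real argument names come from the server's tool schema); `apikey` and `region` are the parameters described under API Key Authentication:

```json
{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "tools/call",
  "params": {
    "name": "mcp_practera_get_project",
    "arguments": {
      "apikey": "<your-practera-api-key>",
      "region": "aus",
      "projectUuid": "<hypothetical-project-identifier>"
    }
  }
}
```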
MCP Client Configuration
When connecting to this MCP server from an MCP client, you'll need to provide:
- API key for Practera authentication (if using API key auth)
- Region for the Practera API (usa, aus, euk or p2-stage)
- OAuth configuration (if using OAuth authentication)
Claude Desktop Configuration Example
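The original configuration snippet is not reproduced in this copy. As a rough sketch, one common way to point Claude Desktop at a remote SSE MCP server is through the `mcp-remote` bridge; this particular approach, and where the Practera API key and region are supplied, are assumptions rather than this repo's documented setup:

```json
{
  "mcpServers": {
    "practera": {
      "command": "npx",
      "args": ["mcp-remote", "http://localhost:3000/sse"]
    }
  }
}
```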
Example Usage (with Claude)
You can ask Claude to interact with Practera data using the MCP tools. For example, you might ask it to summarize the structure of a particular project; Claude would then use the `mcp_practera_get_project` tool, providing the API key and region from the configuration.
License
MIT License