# GIM-MCP

Educational interaction management system based on the Model Context Protocol (MCP) with Ollama integration for conversational AI capabilities. It integrates with local Ollama models to provide natural language processing for generating, managing, and interacting with educational flashcards and content.

## Quick Start (MCP client)

1. Click "Install Server".
2. Wait a few minutes for the server to deploy. Once ready, it will show a "Started" state.
3. In the chat, type `@` followed by the MCP server name and your instructions, e.g., `@GIM-MCP Create a flashcard about the laws of thermodynamics`.

That's it! The server will respond to your query, and you can continue using it as needed.
## Description
GIM-MCP is an MCP server that provides a RESTful API to manage educational interactions (such as flashcards) with artificial intelligence support. The system uses Ollama for natural language processing and allows for dynamic creation, management, and generation of educational content.
## Features

- **MCP Server**: complete Model Context Protocol implementation exposing resources and tools
- **REST API**: HTTP endpoint for interacting with the system via chat
- **Ollama Integration**: natural language processing using local models
- **Dynamic Interactions**: extensible system for defining different types of educational interactions
- **Flashcard Support**: complete flashcard implementation with text, images, and categories
- **Zod Validation**: robust validation schemas for editor and renderer
## Prerequisites

### Required Software

- **Node.js** (v18 or higher): download from https://nodejs.org/
- **Ollama** (required for AI functionality):
  - Windows: download from https://ollama.com/download
  - Linux/Mac: `curl -fsSL https://ollama.com/install.sh | sh`
- **Git** (to clone the repository)
### Ollama Setup
After installing Ollama, download the required model:
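```bash
# llama3.1 is the model referenced elsewhere in this README; swap in another model if you prefer.
ollama pull llama3.1
```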
Verify that Ollama is running:
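```bash
# Quick check that the Ollama API answers (start it with `ollama serve` if it does not):
curl http://localhost:11434/api/tags
```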
The Ollama server should be available at http://localhost:11434
## Installation
1. Clone the repository:

   ```bash
   git clone <repository-url>
   cd gim-mcp
   ```

2. Install dependencies:

   ```bash
   npm install
   ```

3. Verify the TypeScript configuration:

   ```bash
   npm run build
   ```
## Usage

### Development Mode
Start the server in development mode with hot-reload:
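```bash
# Assumes the conventional "dev" script name; check package.json for the exact script.
npm run dev
```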
The server will be available at http://localhost:3000
### Production Mode
1. Build the project:

   ```bash
   npm run build
   ```

2. Start the compiled server:

   ```bash
   npm start
   ```
### Testing the API
You can test the chat endpoint with a POST request:
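A minimal sketch with `curl`, assuming the chat route is `POST /chat` and accepts a JSON body with a `message` field (adjust both to match the actual implementation):

```bash
curl -X POST http://localhost:3000/chat \
  -H "Content-Type: application/json" \
  -d '{"message": "Create a flashcard about the laws of thermodynamics"}'
```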
Or using PowerShell:
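```powershell
# Same endpoint and payload-shape assumptions as the curl example above.
Invoke-RestMethod -Uri "http://localhost:3000/chat" -Method Post `
  -ContentType "application/json" `
  -Body '{"message": "Create a flashcard about the laws of thermodynamics"}'
```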
## Project Structure
## Configuration

### Environment Variables (Optional)
You can create a .env file to customize the configuration:
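A sketch of what such a file could contain; the variable names below are illustrative assumptions, not confirmed by the source:

```
# .env (hypothetical variable names)
PORT=3000
OLLAMA_URL=http://localhost:11434
OLLAMA_MODEL=llama3.1
```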
### Changing the Ollama Model
Edit the src/orchestrator.ts file:
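The snippet below is illustrative; the actual constant or option name in `src/orchestrator.ts` may differ, so look for where the model name is passed to the Ollama client:

```typescript
// Hypothetical excerpt from src/orchestrator.ts
const OLLAMA_MODEL = "llama3.1"; // swap in "llama2", "mistral", etc.
```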
Recommended models:
- `llama3.1` - Balance between performance and capability
- `llama2` - Lighter alternative
- `mistral` - Fast and efficient option
## Available Scripts

| Command | Description |
|---------|-------------|
| `npm run dev` | Start the server in development mode |
| `npm run build` | Compile TypeScript to JavaScript |
| `npm start` | Run the compiled server |
## Testing
The project includes testing support with Vitest:
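```bash
# Assumes a "test" script wired to Vitest in package.json (e.g., "test": "vitest").
npm test
```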
## MCP Client Integration
This server can be used by any Model Context Protocol-compatible client. Resources and tools are exposed via:
- **MCP Resources**: accessible through URIs like `interaction://interaction-flashcard`
- **MCP Tools**: invokable via `read_interaction_flashcard`
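As a sketch of what a client-side call could look like, assuming the official TypeScript MCP SDK (`@modelcontextprotocol/sdk`) and a stdio transport; the entry-point path and argument shape are assumptions, and exact method signatures vary by SDK version:

```typescript
import { Client } from "@modelcontextprotocol/sdk/client/index.js";
import { StdioClientTransport } from "@modelcontextprotocol/sdk/client/stdio.js";

// Spawn the server as a child process over stdio (command/path are assumptions).
const transport = new StdioClientTransport({ command: "node", args: ["dist/index.js"] });
const client = new Client({ name: "gim-mcp-client", version: "1.0.0" });
await client.connect(transport);

// Read the flashcard interaction resource exposed by the server.
const resource = await client.readResource({ uri: "interaction://interaction-flashcard" });

// Invoke the flashcard tool (argument shape is an assumption).
const result = await client.callTool({ name: "read_interaction_flashcard", arguments: {} });
```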
## Available Interactions

### Flashcard
Interaction to create study cards with:

- Text (question/answer)
- Supporting images
- Organizational categories
- Dynamic transformations between editor and renderer
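Since the README mentions Zod schemas for the editor and renderer, here is a minimal sketch of what a flashcard editor schema might look like; every field name below is illustrative, not taken from the source:

```typescript
import { z } from "zod";

// Hypothetical flashcard editor schema; the real definitions live in src/.
const flashcardEditorSchema = z.object({
  question: z.string().min(1),
  answer: z.string().min(1),
  imageUrl: z.string().url().optional(), // supporting image
  category: z.string().optional(),       // organizational category
});

type Flashcard = z.infer<typeof flashcardEditorSchema>;
```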
## Contributing

1. Fork the project
2. Create a feature branch (`git checkout -b feature/AmazingFeature`)
3. Commit your changes (`git commit -m 'Add some AmazingFeature'`)
4. Push to the branch (`git push origin feature/AmazingFeature`)
5. Open a Pull Request
## Troubleshooting

### Server won't start

- Verify that port 3000 is not in use
- Make sure you've run `npm install`
### Ollama not responding

- Verify that Ollama is running: `ollama serve`
- Check that the model is downloaded: `ollama list`
- Verify connectivity: `curl http://localhost:11434/api/tags`
### TypeScript compilation errors

- Make sure TypeScript is installed: `npm install -g typescript`
- Clean and rebuild: `rm -rf dist && npm run build`
## License
ISC
## Author
Project developed as part of the GIM system (Gestión de Interacciones Multimodales - Multimodal Interaction Management).
**Note**: This project requires Ollama running locally to function correctly. Make sure the service is active before starting the server.