# GIM-MCP
Educational interaction management system based on Model Context Protocol (MCP) with Ollama integration for conversational AI capabilities.
## Description
GIM-MCP is an MCP server that provides a RESTful API to manage educational interactions (such as flashcards) with artificial intelligence support. The system uses Ollama for natural language processing and allows for dynamic creation, management, and generation of educational content.
## Features
- **MCP Server**: Complete Model Context Protocol implementation to expose resources and tools
- **REST API**: HTTP endpoint to interact with the system via chat
- **Ollama Integration**: Natural language processing using local models
- **Dynamic Interactions**: Extensible system to define different types of educational interactions
- **Flashcard Support**: Complete flashcard implementation with text, images, and categories
- **Zod Validation**: Robust validation schemas for editor and renderer
## Prerequisites
### Required Software
1. **Node.js** (v18 or higher)
- Download from: https://nodejs.org/
2. **Ollama** (Required for AI functionality)
- **Windows**: Download from https://ollama.com/download
- **Linux/Mac**:
```bash
curl -fsSL https://ollama.com/install.sh | sh
```
3. **Git** (to clone the repository)
### Ollama Setup
After installing Ollama, download the required model:
```bash
ollama pull llama3.1
```
Verify that Ollama is running:
```bash
ollama list
```
The Ollama server should be available at `http://localhost:11434`.
## Installation
1. **Clone the repository**
```bash
git clone <repository-url>
cd gim-mcp
```
2. **Install dependencies**
```bash
npm install
```
3. **Verify TypeScript configuration**
```bash
npm run build
```
## Usage
### Development Mode
Start the server in development mode with hot-reload:
```bash
npm run dev
```
The server will be available at `http://localhost:3000`.
### Production Mode
1. **Build the project**
```bash
npm run build
```
2. **Start the compiled server**
```bash
npm start
```
### Testing the API
You can test the chat endpoint with a POST request:
```bash
curl -X POST http://localhost:3000/api/chat \
-H "Content-Type: application/json" \
-d '{"message": "Create a flashcard about mathematics"}'
```
Or using PowerShell:
```powershell
Invoke-RestMethod -Uri "http://localhost:3000/api/chat" `
-Method POST `
-ContentType "application/json" `
-Body '{"message": "Create a flashcard about mathematics"}'
```
## Project Structure
```
gim-mcp/
├── src/
│   ├── api.ts                        # Express REST API
│   ├── server.ts                     # Main MCP server
│   ├── orchestrator.ts               # Orchestration logic with Ollama
│   ├── mcp-client-local.ts           # Local MCP client
│   ├── mcp/
│   │   └── mcp-tool.types.ts         # MCP tool types
│   ├── prompts/
│   │   └── instrucciones-gim.ts      # System prompts
│   ├── resources/
│   │   ├── index.ts                  # Resource exports
│   │   ├── interaction-base-types.ts # Base types
│   │   ├── mcp-resource.types.ts     # MCP resource types
│   │   ├── flashcard/                # Flashcard implementation
│   │   │   ├── flashcard.editor.schema.ts
│   │   │   ├── flashcard.renderer.schema.ts
│   │   │   ├── flashcard.resource.ts
│   │   │   ├── flashcard.transform.spec.ts
│   │   │   └── index.ts
│   │   └── interactions-index/
│   │       └── interactions-index.resource.ts
│   ├── types/                        # Additional TypeScript types
│   └── utils/                        # Utilities
├── dist/                             # Compiled files
├── package.json
├── tsconfig.json
└── README.md
```
## Configuration
### Environment Variables (Optional)
You can create a `.env` file to customize the configuration:
```env
# API server port
PORT=3000
# Ollama URL
OLLAMA_URL=http://localhost:11434
# Ollama model to use
OLLAMA_MODEL=llama3.1
```
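A minimal sketch of how these variables could be read on the server side, with the same defaults as above (illustrative only; the project's actual configuration handling lives in the source files):

```typescript
// Illustrative config loader: reads the optional environment variables
// documented above, falling back to the documented defaults.
const config = {
  port: Number(process.env.PORT ?? 3000),
  ollamaUrl: process.env.OLLAMA_URL ?? "http://localhost:11434",
  ollamaModel: process.env.OLLAMA_MODEL ?? "llama3.1",
};

export default config;
```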
### Changing the Ollama Model
Edit the [src/orchestrator.ts](src/orchestrator.ts) file:
```typescript
const MODEL = "llama3.1"; // Change to your preferred model
```
Recommended models:
- `llama3.1` - Balance between performance and capability
- `llama2` - Lighter alternative
- `mistral` - Fast and efficient option
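Whichever model you pick, the orchestrator ultimately talks to Ollama's REST API. A hedged sketch of such a call, assuming Node 18+'s global `fetch` (the function name `askOllama` is illustrative, not the project's actual export):

```typescript
// Illustrative helper: sends one non-streaming chat request to Ollama.
// The endpoint and payload shape follow Ollama's /api/chat REST API.
async function askOllama(prompt: string, model = "llama3.1"): Promise<string> {
  const res = await fetch("http://localhost:11434/api/chat", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({
      model,
      messages: [{ role: "user", content: prompt }],
      stream: false, // one complete JSON response instead of a stream
    }),
  });
  if (!res.ok) throw new Error(`Ollama request failed: ${res.status}`);
  const data = await res.json();
  return data.message.content;
}
```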
## Available Scripts
| Command | Description |
|---------|-------------|
| `npm run dev` | Start the server in development mode |
| `npm run build` | Compile TypeScript to JavaScript |
| `npm start` | Run the compiled server |
## Testing
The project includes testing support with Vitest:
```bash
npm test
```
## MCP Client Integration
This server can be used by any Model Context Protocol-compatible client. Resources and tools are exposed via:
- **MCP Resources**: Accessible through URIs like `interaction://interaction-flashcard`
- **MCP Tools**: Invokable via `read_interaction_flashcard`
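On the wire, MCP is JSON-RPC 2.0, so a client reads the resource and invokes the tool with requests shaped like the following (a sketch of the payloads only; request ids and transport framing are normally handled by whatever MCP SDK you use):

```typescript
// JSON-RPC payloads an MCP client would send to this server (sketch).
const readResourceRequest = {
  jsonrpc: "2.0",
  id: 1,
  method: "resources/read",
  params: { uri: "interaction://interaction-flashcard" },
};

const callToolRequest = {
  jsonrpc: "2.0",
  id: 2,
  method: "tools/call",
  params: { name: "read_interaction_flashcard", arguments: {} },
};
```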
## Available Interactions
### Flashcard
Interaction to create study cards with:
- Text (question/answer)
- Supporting images
- Organizational categories
- Dynamic transformations between editor and renderer
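As an illustration of the editor-to-renderer transformation, the shapes below are hypothetical stand-ins; the real Zod schemas live under `src/resources/flashcard/`:

```typescript
// Hypothetical editor/renderer shapes; the actual schemas are defined
// with Zod in flashcard.editor.schema.ts and flashcard.renderer.schema.ts.
interface FlashcardEditor {
  question: string;
  answer: string;
  imageUrl?: string; // supporting image
  category?: string; // organizational category
}

interface FlashcardRenderer {
  front: { text: string; image?: string };
  back: { text: string };
  category: string;
}

// Transform an editor-side card into its renderer-side representation.
function toRenderer(card: FlashcardEditor): FlashcardRenderer {
  return {
    front: { text: card.question, image: card.imageUrl },
    back: { text: card.answer },
    category: card.category ?? "general",
  };
}
```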
## Contributing
1. Fork the project
2. Create a feature branch (`git checkout -b feature/AmazingFeature`)
3. Commit your changes (`git commit -m 'Add some AmazingFeature'`)
4. Push to the branch (`git push origin feature/AmazingFeature`)
5. Open a Pull Request
## Troubleshooting
### Server won't start
- Verify that port 3000 is not in use
- Make sure you've run `npm install`
### Ollama not responding
- Verify that Ollama is running: `ollama serve`
- Check that the model is downloaded: `ollama list`
- Verify connectivity: `curl http://localhost:11434/api/tags`
### TypeScript compilation errors
- Make sure dependencies, including TypeScript, are installed: `npm install`
- Clean and rebuild: `rm -rf dist && npm run build`
## License
ISC
## Author
Project developed as part of the GIM system (Gestión de Interacciones Multimodales - Multimodal Interaction Management)
---
**Note**: This project requires Ollama running locally to function correctly. Make sure you have the service active before starting the server.