Monotype MCP Server & Chat Application

A complete system consisting of:

  1. MCP Server - Plugin-ready server for Monotype API integration

  2. Backend - Ollama-powered bridge between chat UI and MCP server

  3. Frontend - React-based chat interface

Architecture

┌─────────────┐      ┌──────────────┐      ┌─────────────┐      ┌──────────────┐
│  Frontend   │─────▶│   Backend    │─────▶│ MCP Server  │─────▶│ Monotype API │
│   (React)   │      │   (Ollama)   │      │  (Plugin)   │      │              │
└─────────────┘      └──────────────┘      └─────────────┘      └──────────────┘

Project Structure

NextGenAgenticAI/
├── src/                     # MCP Server (can be used as plugin)
│   ├── server.js            # Main MCP server
│   ├── api-client.js        # Monotype API client
│   ├── auth.js              # Authentication service
│   ├── token-decryptor.js   # Token decryption utilities
│   └── ...
├── backend/                 # Backend server
│   ├── server.js            # Express server with Ollama integration
│   └── package.json
├── frontend/                # React chat UI
│   ├── src/
│   │   ├── App.jsx          # Main chat component
│   │   └── ...
│   └── package.json
└── README.md

Quick Start

1. MCP Server (Plugin)

The MCP server can be used independently as a plugin with any chat agent.

Setup:

cd src
npm install

Configuration: Add to your MCP client config:

{
  "mcpServers": {
    "monotype-mcp": {
      "command": "node",
      "args": ["/path/to/src/server.js"],
      "env": {
        "MONOTYPE_TOKEN": "your-token-here"
      }
    }
  }
}

2. Backend Server

Prerequisites:

  • Install Ollama: https://ollama.ai

  • Pull the llama3 model: ollama pull llama3

Setup:

cd backend
npm install
npm start

The backend server runs on http://localhost:3001
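
Once the backend is up, it can be exercised without the chat UI. Below is a minimal sketch of building a request against it; note that the /api/chat path and the { message, token } payload shape are assumptions for illustration, not documented API — check backend/server.js for the real routes.

```javascript
// Build a chat request for the backend on port 3001.
// NOTE: the /api/chat path and the { message, token } payload shape are
// hypothetical; verify them against backend/server.js before use.
function buildChatRequest(message, token, base = "http://localhost:3001") {
  return {
    url: `${base}/api/chat`,
    options: {
      method: "POST",
      headers: { "Content-Type": "application/json" },
      body: JSON.stringify({ message, token }),
    },
  };
}
```

Usage (Node 18+ has a built-in fetch): const { url, options } = buildChatRequest("Show me all teams", token); then await fetch(url, options).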

3. Frontend

Setup:

cd frontend
npm install
npm run dev

The frontend runs on http://localhost:3000

Features

MCP Server Tools

  • invite_user_for_customer - Invite users to your company

  • get_teams_for_customer - Get all teams

  • get_roles_for_customer - Get all roles

Backend Intelligence

  • Uses Ollama (llama3) to detect which tool to call

  • Extracts parameters from natural language

  • Falls back to keyword matching if Ollama is unavailable
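
The keyword fallback can be sketched as a simple substring check against the three tool names listed above. The tool names come from this README; the matching rules themselves are illustrative, not the backend's actual logic:

```javascript
// Hypothetical sketch of the keyword fallback used when Ollama is unreachable.
// Tool names are from this README; the matching heuristics are illustrative.
function detectToolByKeywords(message) {
  const text = message.toLowerCase();
  if (text.includes("invite")) return "invite_user_for_customer";
  if (text.includes("team")) return "get_teams_for_customer";
  if (text.includes("role")) return "get_roles_for_customer";
  return null; // no tool matched; fall through to a plain chat reply
}
```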

Frontend

  • Secure token input

  • Modern chat interface

  • Real-time responses

  • Tool usage indicators

Usage Examples

Via Chat UI

  1. Start backend and frontend

  2. Enter your token

  3. Try these commands:

    • "What roles are in my company?"

    • "Invite user@example.com to my company"

    • "Show me all teams"
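
A command like the invite example above carries its parameter inline. One way the backend's natural-language parameter extraction could work is a simple regex pull; this is an illustrative sketch, not the backend's actual parser:

```javascript
// Illustrative only: extract an email address from a natural-language command,
// e.g. "Invite user@example.com to my company".
function extractEmail(message) {
  const match = message.match(/[\w.+-]+@[\w-]+(\.[\w-]+)+/);
  return match ? match[0] : null;
}
```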

Via MCP Plugin

Use the MCP server directly with any MCP-compatible chat agent (e.g., Cursor or Claude Desktop).

Development

Running All Services

Terminal 1 - Backend:

cd backend
npm run dev

Terminal 2 - Frontend:

cd frontend
npm run dev

Terminal 3 - MCP Server (if testing standalone):

cd src
npm start

Environment Variables

Backend

  • MCP_SERVER_PATH - Path to MCP server script (default: ../src/server.js)

  • OLLAMA_API_URL - Ollama API URL (default: http://localhost:11434)
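
The defaults above imply the backend reads these variables with fallbacks; a minimal sketch (variable names are from this README, the reading logic is assumed):

```javascript
// Read backend configuration, falling back to the defaults documented above.
const config = {
  mcpServerPath: process.env.MCP_SERVER_PATH || "../src/server.js",
  ollamaApiUrl: process.env.OLLAMA_API_URL || "http://localhost:11434",
};
```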

MCP Server

  • MONOTYPE_TOKEN - Your Monotype authentication token (optional, can be set in MCP config)

License

MIT
