Click on "Install Server".
Wait a few minutes for the server to deploy. Once ready, it will show a "Started" state.
In the chat, type @ followed by the MCP server name and your instructions, e.g., "@Monotype MCP Server Invite john.doe@example.com to our engineering team".
That's it! The server will respond to your query, and you can continue using it as needed.
Monotype MCP Server & Chat Application
A complete system consisting of:
MCP Server - Plugin-ready server for Monotype API integration
Backend - Ollama-powered bridge between chat UI and MCP server
Frontend - React-based chat interface
Architecture
┌─────────────┐      ┌──────────────┐      ┌─────────────┐      ┌──────────────┐
│  Frontend   │─────▶│   Backend    │─────▶│  MCP Server │─────▶│ Monotype API │
│   (React)   │      │   (Ollama)   │      │   (Plugin)  │      │              │
└─────────────┘      └──────────────┘      └─────────────┘      └──────────────┘

Project Structure
NextGenAgenticAI/
├── src/ # MCP Server (can be used as plugin)
│ ├── server.js # Main MCP server
│ ├── api-client.js # Monotype API client
│ ├── auth.js # Authentication service
│ ├── token-decryptor.js # Token decryption utilities
│ └── ...
├── backend/ # Backend server
│ ├── server.js # Express server with Ollama integration
│ └── package.json
├── frontend/ # React chat UI
│ ├── src/
│ │ ├── App.jsx # Main chat component
│ │ └── ...
│ └── package.json
└── README.md

Quick Start
1. MCP Server (Plugin)
The MCP server can be used independently as a plugin with any chat agent.
Setup:
cd src
npm install

Configuration: Add to your MCP client config:
{
  "mcpServers": {
    "monotype-mcp": {
      "command": "node",
      "args": ["/path/to/src/server.js"],
      "env": {
        "MONOTYPE_TOKEN": "your-token-here"
      }
    }
  }
}

2. Backend Server
Prerequisites:
Install Ollama: https://ollama.ai
Pull llama3 model:
ollama pull llama3
Setup:
cd backend
npm install
npm start

The server runs on http://localhost:3001
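The backend talks to Ollama over its local REST API. As a sketch (the prompt wording below is an illustrative assumption, not the backend's actual prompt), a generate request might be built like this:

```javascript
// Sketch of a request body for Ollama's generate endpoint
// (POST http://localhost:11434/api/generate).
const ollamaRequest = {
  model: "llama3", // the model pulled via `ollama pull llama3`
  prompt: "Which tool should handle: 'Show me all teams'? Reply with only the tool name.",
  stream: false    // request a single JSON response instead of a token stream
};

// Usage (requires Ollama running locally):
// const res = await fetch("http://localhost:11434/api/generate", {
//   method: "POST",
//   headers: { "Content-Type": "application/json" },
//   body: JSON.stringify(ollamaRequest)
// });
```

With stream set to false, Ollama returns one JSON object whose response field holds the full completion, which is simpler for one-shot tool detection than consuming a stream.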
3. Frontend
Setup:
cd frontend
npm install
npm run dev

The frontend runs on http://localhost:3000
Features
MCP Server Tools
invite_user_for_customer - Invite users to your company
get_teams_for_customer - Get all teams
get_roles_for_customer - Get all roles
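Under the hood, MCP tools are invoked with JSON-RPC 2.0 messages sent over stdio. A minimal sketch of a tools/call request for the invite tool — the argument name email here is an assumption; consult the server's actual tool schema:

```javascript
// Hypothetical JSON-RPC 2.0 tools/call message for the invite tool.
// The "email" argument name is an illustrative assumption.
const toolCall = {
  jsonrpc: "2.0",
  id: 1,
  method: "tools/call",
  params: {
    name: "invite_user_for_customer",
    arguments: { email: "john.doe@example.com" }
  }
};

// Serialized form, as it would be written to the server's stdin.
const wire = JSON.stringify(toolCall);
```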
Backend Intelligence
Uses Ollama (llama3) to detect which tool to call
Extracts parameters from natural language
Fallback keyword matching if Ollama unavailable
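The fallback matching could look roughly like this — the keyword table is a guess at plausible triggers, not the backend's actual rules:

```javascript
// Minimal keyword-based tool detection, used when Ollama is unreachable.
// Keyword choices are illustrative assumptions.
function detectTool(message) {
  const text = message.toLowerCase();
  if (text.includes("invite")) return "invite_user_for_customer";
  if (text.includes("team"))   return "get_teams_for_customer";
  if (text.includes("role"))   return "get_roles_for_customer";
  return null; // no tool matched — fall through to a plain chat reply
}
```

For example, detectTool("Show me all teams") returns "get_teams_for_customer". Checking "invite" first means a message that mentions both inviting and a team still routes to the invite tool.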
Frontend
Secure token input
Modern chat interface
Real-time responses
Tool usage indicators
Usage Examples
Via Chat UI
Start backend and frontend
Enter your token
Try these commands:
"What roles are in my company?"
"Invite user@example.com to my company"
"Show me all teams"
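The invite command above embeds an email address that the backend has to pull out of free text. A simple extraction sketch — this regex is illustrative, not the backend's actual parser:

```javascript
// Extract the first email address from a natural-language message.
// The regex is a deliberately simple, illustrative assumption.
function extractEmail(message) {
  const match = message.match(/[\w.+-]+@[\w-]+\.[\w.-]+/);
  return match ? match[0] : null;
}
```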
Via MCP Plugin
Use the MCP server directly with any MCP-compatible chat agent (like Cursor, Claude Desktop, etc.)
Development
Running All Services
Terminal 1 - Backend:
cd backend
npm run dev

Terminal 2 - Frontend:
cd frontend
npm run dev

Terminal 3 - MCP Server (if testing standalone):
cd src
npm start

Environment Variables
Backend
MCP_SERVER_PATH - Path to MCP server script (default: ../src/server.js)
OLLAMA_API_URL - Ollama API URL (default: http://localhost:11434)
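In Node, these defaults are typically applied when the environment is read; a minimal sketch:

```javascript
// Read backend configuration, falling back to the documented defaults
// when the environment variables are unset.
const MCP_SERVER_PATH = process.env.MCP_SERVER_PATH || "../src/server.js";
const OLLAMA_API_URL = process.env.OLLAMA_API_URL || "http://localhost:11434";
```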
MCP Server
MONOTYPE_TOKEN - Your Monotype authentication token (optional, can be set in MCP config)
License
MIT