# Setup Guide

Complete setup instructions for the Monotype Chat Application.

## Prerequisites

1. **Node.js 18+** - [Download](https://nodejs.org/)
2. **Ollama** - [Download](https://ollama.ai)
3. **Monotype Token** - Your encrypted authentication token

## Step 1: Install Ollama and Pull the Model

```bash
# Install Ollama (follow the instructions at https://ollama.ai)

# Pull the llama3 model
ollama pull llama3

# Verify it's working
ollama run llama3 "Hello"
```

## Step 2: Set Up the MCP Server

```bash
# Navigate to the project root
cd /path/to/NextGenAgenticAI

# Install dependencies
npm install
```

The MCP server is now ready to use as a plugin.

## Step 3: Set Up the Backend

```bash
cd backend
npm install
```

## Step 4: Set Up the Frontend

```bash
cd frontend
npm install
```

## Step 5: Run the Application

### Terminal 1 - Backend Server

```bash
cd backend
npm start
# Server runs on http://localhost:3001
```

### Terminal 2 - Frontend

```bash
cd frontend
npm run dev
# Frontend runs on http://localhost:3000
```

## Step 6: Use the Chat UI

1. Open a browser at `http://localhost:3000`
2. Enter your Monotype token in the token field
3. Click "Set Token"
4. Start chatting!

### Example Commands

- "What roles are in my company?"
- "Invite user@example.com to my company"
- "Show me all teams"

## Troubleshooting

### Ollama Not Running

```bash
# Check whether Ollama is running
curl http://localhost:11434/api/tags

# Start Ollama if needed
ollama serve
```

### MCP Server Connection Issues

- Verify that the path to `src/server.js` is correct
- Check that the `MONOTYPE_TOKEN` environment variable is set
- Ensure Node.js 18+ is installed

### Backend Errors

- Check that Ollama is running on port 11434
- Verify that the llama3 model is installed: `ollama list`
- Check the backend logs for detailed error messages

## Architecture Overview

```
Frontend (React) → Backend (Express + Ollama) → MCP Server → Monotype API
```

- **Frontend**: Chat UI that accepts the token and messages
- **Backend**: Uses Ollama to decide which MCP tool to call
- **MCP Server**: Plugin-ready server that can be used with any MCP client
- **Monotype API**: External API for user management

## Using the MCP Server as a Plugin

The MCP server in `src/` can be used independently with any MCP-compatible client:

```json
{
  "mcpServers": {
    "monotype-mcp": {
      "command": "node",
      "args": ["/absolute/path/to/src/server.js"],
      "env": {
        "MONOTYPE_TOKEN": "your-token-here"
      }
    }
  }
}
```
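## Preflight Check (Optional)

The troubleshooting checks above can be run in one pass before starting the servers. The script below is a sketch, not part of the project: the port (11434), model name (llama3), and `MONOTYPE_TOKEN` variable are taken from this guide, and the file name `preflight.sh` is only a suggestion.

```shell
# preflight.sh - sanity-check the prerequisites from this guide (hypothetical helper)
fail=0

# Node.js installed? (this guide requires 18+)
if command -v node >/dev/null 2>&1; then
  echo "node: $(node --version)"
else
  echo "node: NOT FOUND - install Node.js 18+"
  fail=$((fail + 1))
fi

# Ollama API reachable on its default port?
if curl -s --max-time 2 http://localhost:11434/api/tags >/dev/null 2>&1; then
  echo "ollama: reachable"
else
  echo "ollama: not reachable - run 'ollama serve'"
  fail=$((fail + 1))
fi

# llama3 model pulled? (only checked when the ollama CLI is on PATH)
if command -v ollama >/dev/null 2>&1; then
  if ollama list 2>/dev/null | grep -q llama3; then
    echo "llama3: installed"
  else
    echo "llama3: missing - run 'ollama pull llama3'"
    fail=$((fail + 1))
  fi
fi

# Token set for the MCP server?
if [ -n "$MONOTYPE_TOKEN" ]; then
  echo "MONOTYPE_TOKEN: set"
else
  echo "MONOTYPE_TOKEN: not set"
  fail=$((fail + 1))
fi

# A standalone script would finish with: exit $fail
echo "preflight: $fail problem(s) found"
```

Run it with any POSIX shell (`sh preflight.sh`); each reported problem maps back to a Troubleshooting entry above.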
