
unichat-ts-mcp-server

Unichat MCP Server in TypeScript

Also available in Python

Send requests to OpenAI, MistralAI, Anthropic, xAI, Google AI or DeepSeek using the MCP protocol via a tool or predefined prompts. A vendor API key is required.

Both STDIO and SSE transport mechanisms are supported via command-line arguments.

Tools

The server implements one tool:

  • unichat: Send a request to unichat
    • Takes "messages" as required string arguments
    • Returns a response
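
For reference, a host application can call the tool through the official MCP TypeScript SDK. The following is a minimal sketch, assuming the server has been built locally; the exact shape of the "messages" payload (an array of role/content objects here) is an assumption and may differ from the server's actual input schema.

import { Client } from "@modelcontextprotocol/sdk/client/index.js";
import { StdioClientTransport } from "@modelcontextprotocol/sdk/client/stdio.js";

// Spawn the built server over stdio and connect an MCP client to it.
const transport = new StdioClientTransport({
  command: "node",
  args: ["build/index.js"],
  env: {
    UNICHAT_MODEL: "gpt-4o-mini",
    UNICHAT_API_KEY: process.env.UNICHAT_API_KEY ?? "",
  },
});

const client = new Client({ name: "example-client", version: "1.0.0" }, { capabilities: {} });
await client.connect(transport);

// Call the unichat tool; the messages shape below is an assumption, not the documented schema.
const result = await client.callTool({
  name: "unichat",
  arguments: {
    messages: [
      { role: "system", content: "You are a helpful assistant." },
      { role: "user", content: "Summarize the MCP protocol in one sentence." },
    ],
  },
});
console.log(result);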

Prompts

  • code_review
    • Review code for best practices, potential issues, and improvements
    • Arguments:
      • code (string, required): The code to review
  • document_code
    • Generate documentation for code including docstrings and comments
    • Arguments:
      • code (string, required): The code to comment
  • explain_code
    • Explain how a piece of code works in detail
    • Arguments:
      • code (string, required): The code to explain
  • code_rework
    • Apply requested changes to the provided code
    • Arguments:
      • changes (string, optional): The changes to apply
      • code (string, required): The code to rework
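
Predefined prompts are retrieved with a standard prompts/get request. A minimal sketch using the same TypeScript SDK client as above (the sample code string is purely illustrative):

// Fetch the code_review prompt with its required "code" argument.
const prompt = await client.getPrompt({
  name: "code_review",
  arguments: { code: "function add(a, b) { return a + b; }" },
});

// The returned messages are ready to be forwarded to the model by the host application.
console.log(prompt.messages);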

Development

Install dependencies:

npm install

Build the server:

npm run build

For development with auto-rebuild:

npm run watch

Running evals

The evals package loads an MCP client that runs the index.ts file directly, so there is no need to rebuild between tests. You can load environment variables by prefixing the npx command. Full documentation can be found here.

OPENAI_API_KEY=your-key npx mcp-eval src/evals/evals.ts src/server.ts

Installation

Installing via Smithery

To install Unichat MCP Server for Claude Desktop automatically via Smithery:

npx -y @smithery/cli install unichat-ts-mcp-server --client claude

Installing manually

To use with Claude Desktop, add the server config:

On macOS: ~/Library/Application Support/Claude/claude_desktop_config.json
On Windows: %APPDATA%/Claude/claude_desktop_config.json

Run locally:

{ "mcpServers": { "unichat-ts-mcp-server": { "command": "node", "args": [ "{{/path/to}}/unichat-ts-mcp-server/build/index.js" ], "env": { "UNICHAT_MODEL": "YOUR_PREFERRED_MODEL_NAME", "UNICHAT_API_KEY": "YOUR_VENDOR_API_KEY" } } }

Run published:

{ "mcpServers": { "unichat-ts-mcp-server": { "command": "npx", "args": [ "-y", "unichat-ts-mcp-server" ], "env": { "UNICHAT_MODEL": "YOUR_PREFERRED_MODEL_NAME", "UNICHAT_API_KEY": "YOUR_VENDOR_API_KEY" } } }

The server runs in STDIO mode by default (or with the --stdio argument). To run in SSE mode, add the --sse argument:

npx -y unichat-ts-mcp-server --sse
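
When started with --sse, the server exposes an HTTP endpoint that clients connect to instead of spawning a local process. Below is a minimal sketch using the TypeScript SDK, assuming the default endpoint at http://localhost:3001/sse (the URL referenced in the Debugging section); adjust the host and port to wherever the server is running.

import { Client } from "@modelcontextprotocol/sdk/client/index.js";
import { SSEClientTransport } from "@modelcontextprotocol/sdk/client/sse.js";

// Connect to the running SSE endpoint rather than launching the server over stdio.
const transport = new SSEClientTransport(new URL("http://localhost:3001/sse"));
const client = new Client({ name: "example-sse-client", version: "1.0.0" }, { capabilities: {} });
await client.connect(transport);

console.log(await client.listTools());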

Supported Models:

A list of currently supported models to be used as "YOUR_PREFERRED_MODEL_NAME" may be found here. Please make sure to add the relevant vendor API key as "YOUR_VENDOR_API_KEY".

Example:

"env": { "UNICHAT_MODEL": "gpt-4o-mini", "UNICHAT_API_KEY": "YOUR_OPENAI_API_KEY" }

Debugging

Since MCP servers communicate over stdio, debugging can be challenging. We recommend using the MCP Inspector, which is available as a package script:

npm run inspector

The Inspector will provide a URL to access debugging tools in your browser.

If you experience timeouts during testing in SSE mode, change the request URL in the Inspector interface to: http://localhost:3001/sse?timeout=600000


