
MCP AI Gateway

by kinhunt

chat_completion

Sends chat completion requests to AI models with configurable parameters such as temperature and max_tokens, and returns the raw API response without format conversion, suitable for enterprise AI integration.

Instructions

Send a chat completion request to the configured AI API provider (ANTHROPIC). Supports parameters such as model, messages, temperature, max_tokens, and stream, and returns the raw response from the API without format conversion.

Custom AI model for enterprise use
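As a rough sketch of what a minimal invocation looks like, here is a Python snippet that builds the tool arguments and wraps them in an MCP `tools/call` request. Only the required `messages` field is supplied, so the documented defaults apply; the surrounding JSON-RPC envelope is standard MCP, but how you transmit it depends on your client.

```python
import json

# Minimal arguments for chat_completion. Only "messages" is required;
# model, temperature, and max_tokens fall back to the documented
# defaults (claude-3-sonnet-20240229, 0.7, 4096).
arguments = {
    "messages": [
        {"role": "system", "content": "You are a concise assistant."},
        {"role": "user", "content": "Summarize MCP in one sentence."},
    ]
}

# An MCP client would send these arguments in a tools/call request:
request = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "tools/call",
    "params": {"name": "chat_completion", "arguments": arguments},
}
print(json.dumps(request, indent=2))
```

The response is the provider's raw payload, so your client should expect Anthropic's native response shape rather than a normalized one.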

Input Schema

| Name | Required | Description | Default |
| --- | --- | --- | --- |
| frequency_penalty | No | Penalizes new tokens based on their frequency | |
| max_tokens | No | Maximum number of tokens to generate | 4096 |
| messages | Yes | Array of message objects with role and content | |
| model | No | Model to use for completion | claude-3-sonnet-20240229 |
| presence_penalty | No | Penalizes new tokens based on whether they appear in the text | |
| response_format | No | Format of the response (OpenAI only). Supports json_object and json_schema types. | |
| stop | No | Up to 4 sequences where the API will stop generating further tokens | |
| stream | No | Whether to stream the response | false |
| temperature | No | Controls randomness in the response | 0.7 |
| top_p | No | Controls diversity via nucleus sampling | |
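The most structured of these parameters is response_format, which is only honored by OpenAI-compatible providers. Here is a hedged sketch of arguments using the json_schema type; the `sentiment_report` schema and its fields are invented for illustration, while the outer shape (type / json_schema / name / schema / strict) follows the tool's input schema.

```python
# Example arguments using response_format with type "json_schema".
# Note: response_format is documented as OpenAI-only; the
# "sentiment_report" schema below is a made-up illustration.
arguments = {
    "messages": [
        {"role": "user", "content": "Classify the sentiment of: 'Great tool!'"}
    ],
    "response_format": {
        "type": "json_schema",
        "json_schema": {
            "name": "sentiment_report",
            "schema": {
                "type": "object",
                "properties": {
                    "label": {"type": "string"},
                    "confidence": {"type": "number"},
                },
                "required": ["label"],
            },
            "strict": True,
        },
    },
    "temperature": 0.2,
}
```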

Input Schema (JSON Schema)

{
  "properties": {
    "frequency_penalty": {
      "description": "Penalizes new tokens based on their frequency",
      "maximum": 2,
      "minimum": -2,
      "type": "number"
    },
    "max_tokens": {
      "description": "Maximum number of tokens to generate (default: 4096)",
      "minimum": 1,
      "type": "number"
    },
    "messages": {
      "description": "Array of message objects with role and content",
      "items": {
        "properties": {
          "content": { "type": "string" },
          "role": {
            "enum": ["system", "user", "assistant"],
            "type": "string"
          }
        },
        "required": ["role", "content"],
        "type": "object"
      },
      "type": "array"
    },
    "model": {
      "description": "Model to use for completion (default: claude-3-sonnet-20240229)",
      "type": "string"
    },
    "presence_penalty": {
      "description": "Penalizes new tokens based on whether they appear in the text",
      "maximum": 2,
      "minimum": -2,
      "type": "number"
    },
    "response_format": {
      "description": "Format of the response (OpenAI only). Supports json_object and json_schema types.",
      "properties": {
        "json_schema": {
          "description": "JSON schema definition (required when type is json_schema)",
          "properties": {
            "name": { "description": "Name of the schema", "type": "string" },
            "schema": { "description": "JSON schema object", "type": "object" },
            "strict": { "description": "Whether to use strict validation", "type": "boolean" }
          },
          "required": ["name", "schema"],
          "type": "object"
        },
        "type": {
          "description": "The type of response format",
          "enum": ["text", "json_object", "json_schema"],
          "type": "string"
        }
      },
      "required": ["type"],
      "type": "object"
    },
    "stop": {
      "description": "Up to 4 sequences where the API will stop generating further tokens",
      "oneOf": [
        { "type": "string" },
        { "items": { "type": "string" }, "type": "array" }
      ]
    },
    "stream": {
      "default": false,
      "description": "Whether to stream the response",
      "type": "boolean"
    },
    "temperature": {
      "description": "Controls randomness in the response (default: 0.7)",
      "maximum": 2,
      "minimum": 0,
      "type": "number"
    },
    "top_p": {
      "description": "Controls diversity via nucleus sampling",
      "maximum": 1,
      "minimum": 0,
      "type": "number"
    }
  },
  "required": ["messages"],
  "type": "object"
}
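The numeric bounds in the schema can be checked client-side before sending a request. Below is a minimal stdlib-only sketch of such a validator; the function name and error strings are illustrative, and a real client might instead use a full JSON Schema library.

```python
# Stdlib-only sanity checks mirroring the schema's constraints:
# messages is a required non-empty array, temperature is in [0, 2],
# top_p in [0, 1], the penalties in [-2, 2], and max_tokens >= 1.
def validate_arguments(args: dict) -> list[str]:
    errors = []
    if not isinstance(args.get("messages"), list) or not args["messages"]:
        errors.append("messages: required non-empty array")
    bounds = {
        "temperature": (0, 2),
        "top_p": (0, 1),
        "frequency_penalty": (-2, 2),
        "presence_penalty": (-2, 2),
    }
    for key, (lo, hi) in bounds.items():
        if key in args and not (lo <= args[key] <= hi):
            errors.append(f"{key}: must be between {lo} and {hi}")
    if "max_tokens" in args and args["max_tokens"] < 1:
        errors.append("max_tokens: must be at least 1")
    return errors

good = {"messages": [{"role": "user", "content": "hi"}], "temperature": 0.7}
bad = {"temperature": 3.5}
print(validate_arguments(good))  # -> []
print(validate_arguments(bad))   # -> two errors: missing messages, temperature out of range
```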
