by Noveum

BridgeML_API

Enable seamless text generation by sending POST requests to the BridgeML API on the API-Market MCP Server. Process user and assistant messages, adjust parameters like temperature and max_tokens, and generate natural language responses for dynamic applications.

Instructions

Make a POST request to `bridgeml/codellama/bridgeml/codellama`.

Input Schema

| Name | Required | Description | Default |
| --- | --- | --- | --- |
| frequency_penalty | No | Frequency penalty value | |
| max_tokens | No | Maximum number of tokens to generate | |
| messages | No | List of messages | |
| stream | No | Flag indicating if the response should be streamed | |
| temperature | No | Temperature for text generation | |
| top_p | No | Top-p (nucleus) sampling value | |
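As a sketch, a chat-style request body matching the schema above can be built and sent like this. The base URL and API-key header name are assumptions (check your API-Market credentials); only the endpoint path and parameter names come from the schema:

```python
import json
import urllib.request

# Assumed base URL -- substitute the one from your API-Market account.
URL = "https://example.api.market/bridgeml/codellama/bridgeml/codellama"

payload = {
    "messages": [  # list of role/content message objects
        {"role": "user", "content": "Write a haiku about rivers."},
    ],
    "temperature": 0.7,        # sampling temperature
    "max_tokens": 256,         # cap on generated tokens
    "top_p": 1.0,              # top-p (nucleus) sampling value
    "frequency_penalty": 0.0,  # penalize frequently repeated tokens
    "stream": False,           # set True to stream the response
}

req = urllib.request.Request(
    URL,
    data=json.dumps(payload).encode("utf-8"),
    headers={
        "Content-Type": "application/json",
        "x-api-key": "YOUR_API_KEY",  # assumed header name
    },
    method="POST",
)
# response = urllib.request.urlopen(req)  # uncomment with valid credentials
```

Every field is optional per the schema, so omit any parameter you want the server to default.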


MCP directory API

We provide all the information about MCP servers via our MCP API.

curl -X GET 'https://glama.ai/api/mcp/v1/servers/Noveum/api-market-mcp-server'

If you have feedback or need assistance with the MCP directory API, please join our Discord server.