BridgeML_API
Generate text by sending POST requests to the BridgeML API on the API-Market MCP Server. The endpoint accepts user and assistant messages, supports parameters such as temperature and max_tokens, and returns natural language responses for dynamic applications.
Instructions
Make a POST request to the bridgeml/codellama/bridgeml/codellama endpoint with a JSON body matching the schema below; an example request follows the table.
Input Schema
| Name | Required | Description | Default |
|---|---|---|---|
| frequency_penalty | No | Penalty applied to repeated tokens | |
| max_tokens | No | Maximum number of tokens to generate | |
| messages | No | List of chat messages (user and assistant turns) | |
| stream | No | Whether the response should be streamed | |
| temperature | No | Sampling temperature for text generation | |
| top_p | No | Top-p (nucleus) sampling value | |
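
Below is a minimal sketch of such a request in Python. Only the endpoint path and the parameter names come from the schema above; the base URL, the auth header name, and the OpenAI-style role/content message format are assumptions and may differ for your API-Market account.

```python
import requests

# Assumed values -- replace with the base URL and API key for your account.
BASE_URL = "https://prod.api.market/api/v1"  # assumption, not from the docs
API_KEY = "YOUR_API_KEY"

# Request body built from the input schema above; the role/content message
# shape is an assumption based on common chat-completion APIs.
payload = {
    "messages": [
        {"role": "user", "content": "Write a haiku about bridges."}
    ],
    "temperature": 0.7,
    "max_tokens": 256,
    "top_p": 0.9,
    "frequency_penalty": 0.0,
    "stream": False,
}

response = requests.post(
    f"{BASE_URL}/bridgeml/codellama/bridgeml/codellama",
    headers={
        "x-magicapi-key": API_KEY,  # assumption: header name may differ
        "Content-Type": "application/json",
    },
    json=payload,
    timeout=60,
)
response.raise_for_status()
print(response.json())
```

With stream set to False the full completion is returned in a single JSON response; setting it to True would require reading the response incrementally instead of calling response.json() once.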