# chat

Send messages to AI models from Anthropic, OpenAI, Google, Mistral, and Together.ai using Bitcoin Lightning payments. Pay per request with prepaid spend tokens; no accounts or API keys required.
## Instructions
Send a message to an AI model via LightningProx. Pay per request with a Lightning spend token. Supports 19 models from Anthropic, OpenAI, Together.ai, Mistral, and Google.
## Input Schema
| Name | Required | Description | Default |
|---|---|---|---|
| model | Yes | Model ID (e.g. claude-opus-4-5-20251101, gpt-4-turbo, gemini-2.5-pro, mistral-large-latest, deepseek-ai/DeepSeek-V3) | |
| message | Yes | The user message to send | |
| spend_token | Yes | LightningProx spend token (starts with lnpx_). Get one at lightningprox.com/topup | |
| max_tokens | No | Maximum tokens in response | 1024 |
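As a sketch of how a call might be shaped against this schema, here is a hypothetical arguments object with a minimal client-side check of the required fields. The model choice, message text, and spend token value are placeholders, not real credentials:

```typescript
// Example arguments for the chat tool, matching the input schema above.
// The spend token here is a placeholder, not a real lnpx_ token.
const chatArgs = {
  model: "gpt-4-turbo",
  message: "Summarize the Lightning Network in one sentence.",
  spend_token: "lnpx_example_placeholder",
  max_tokens: 512,
};

// Minimal client-side check of the schema's required fields
// before sending the request.
const requiredFields = ["model", "message", "spend_token"];
const missing = requiredFields.filter((k) => !(k in chatArgs));
// missing is empty when the call is well-formed
```

Only `model`, `message`, and `spend_token` are required; `max_tokens` falls back to 1024 server-side when omitted.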
## Implementation Reference
- `src/index.ts:196-212` (handler): The handler logic for the 'chat' tool inside the MCP server request handler.
```typescript
case "chat": {
  const { model, message, spend_token, max_tokens } = args as any;
  const result = await chat(model, message, spend_token, max_tokens);
  const content =
    result.content?.[0]?.text ||
    result.choices?.[0]?.message?.content ||
    JSON.stringify(result);
  const usage = result.usage
    ? `\n\n— ${result.usage.input_tokens ?? result.usage.prompt_tokens ?? "?"} in / ${result.usage.output_tokens ?? result.usage.completion_tokens ?? "?"} out`
    : "";
  return {
    content: [{ type: "text", text: content + usage }],
  };
}
```

- `src/index.ts:112-135` (helper): The helper function that performs the network request to the AI model API.
```typescript
async function chat(
  model: string,
  message: string,
  spendToken: string,
  maxTokens: number = 1024
): Promise<any> {
  const res = await fetch(`${LIGHTNINGPROX_URL}/v1/messages`, {
    method: "POST",
    headers: {
      "Content-Type": "application/json",
      "X-Spend-Token": spendToken,
    },
    body: JSON.stringify({
      model,
      messages: [{ role: "user", content: message }],
      max_tokens: maxTokens,
    }),
  });
  if (!res.ok) {
    const err = await res.json() as any;
    throw new Error(err.error || `LightningProx error: ${res.status}`);
  }
  return res.json();
}
```

- `src/index.ts:23-50` (schema): Tool definition and schema for the 'chat' tool.
```typescript
{
  name: "chat",
  description:
    "Send a message to an AI model via LightningProx. Pay per request with a Lightning spend token. Supports 19 models from Anthropic, OpenAI, Together.ai, Mistral, and Google.",
  inputSchema: {
    type: "object",
    properties: {
      model: {
        type: "string",
        description:
          "Model ID (e.g. claude-opus-4-5-20251101, gpt-4-turbo, gemini-2.5-pro, mistral-large-latest, deepseek-ai/DeepSeek-V3)",
      },
      message: {
        type: "string",
        description: "The user message to send",
      },
      spend_token: {
        type: "string",
        description:
          "LightningProx spend token (starts with lnpx_). Get one at lightningprox.com/topup",
      },
      max_tokens: {
        type: "number",
        description: "Maximum tokens in response (default: 1024)",
      },
    },
    required: ["model", "message", "spend_token"],
  },
},
```
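Because the proxy fronts several providers, the handler has to normalize two response shapes: Anthropic-style bodies carry the text at `content[0].text` with `input_tokens`/`output_tokens` usage, while OpenAI-style bodies carry it at `choices[0].message.content` with `prompt_tokens`/`completion_tokens`. The following standalone sketch isolates that normalization logic (the type is an assumption inferred from the handler, not a published LightningProx type):

```typescript
// Minimal shape covering both provider response styles the handler expects.
type ProviderResponse = {
  content?: { type?: string; text?: string }[];
  choices?: { message?: { content?: string } }[];
  usage?: {
    input_tokens?: number;
    output_tokens?: number;
    prompt_tokens?: number;
    completion_tokens?: number;
  };
};

// Pick the reply text from whichever field the provider populated,
// falling back to the raw JSON if neither is present.
function extractText(result: ProviderResponse): string {
  return (
    result.content?.[0]?.text ||
    result.choices?.[0]?.message?.content ||
    JSON.stringify(result)
  );
}

// Build the "— N in / M out" usage footer, preferring Anthropic field
// names and falling back to OpenAI ones.
function usageFooter(result: ProviderResponse): string {
  if (!result.usage) return "";
  const input = result.usage.input_tokens ?? result.usage.prompt_tokens ?? "?";
  const output =
    result.usage.output_tokens ?? result.usage.completion_tokens ?? "?";
  return `\n\n— ${input} in / ${output} out`;
}
```

Keeping this fallback chain in one place is what lets a single `chat` tool serve all 19 models without per-provider branches in the handler.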