# lightningprox-mcp
lightningprox-mcp enables pay-per-request AI model access via Bitcoin Lightning payments — no accounts or API keys required.
- `chat`: Send text or multimodal (vision) prompts to any of 19 AI models across 5 providers (Anthropic, OpenAI, Together.ai, Mistral, Google) using a Lightning spend token.
- `list_models`: Retrieve all supported models with their IDs, providers, and per-request pricing in satoshis.
- `get_balance`: Query the remaining satoshi balance on a spend token.
- `generate_invoice`: Create a BOLT11 payment request to top up a spend token with a specified amount in satoshis.
- `check_payment`: Verify whether a Lightning invoice has been paid and retrieve the resulting spend token once confirmed.
MCP server for LightningProx — pay-per-request AI via Bitcoin Lightning. No accounts, no API keys. Load a spend token, start querying.
## Install

```
npx lightningprox-mcp
```

## What LightningProx Is
LightningProx is an AI gateway that accepts Bitcoin Lightning payments instead of API keys. You load a prepaid spend token, pass it in the X-Spend-Token header, and each request is deducted from your balance in sats. No signup, no monthly plan, no credentials to manage.
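The per-request flow is small enough to script directly. A minimal Python sketch of building an authenticated request (the `/v1/chat` endpoint and `X-Spend-Token` header match the curl examples in this README; the helper name is ours):

```python
import json
import urllib.request

BASE_URL = "https://lightningprox.com"  # gateway base URL


def build_chat_request(model: str, prompt: str, spend_token: str) -> urllib.request.Request:
    """Build a chat request authenticated with a prepaid spend token.

    The spend token is the only credential: each request's cost in sats
    is deducted from the token's balance server-side.
    """
    body = json.dumps({
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }).encode()
    return urllib.request.Request(
        f"{BASE_URL}/v1/chat",
        data=body,
        headers={
            "Content-Type": "application/json",
            "X-Spend-Token": spend_token,  # no API key, no account
        },
        method="POST",
    )


# Send it with: urllib.request.urlopen(build_chat_request(...))
```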
19 models across 5 providers:
| Provider | Models |
|---|---|
| Anthropic | claude-opus-4-6, claude-sonnet-4-6, claude-haiku-4-5 |
| OpenAI | gpt-4o, gpt-4-turbo, gpt-4o-mini |
| Together.ai | llama-4-maverick, llama-3.3-70b, deepseek-v3, mixtral-8x7b |
| Mistral | mistral-large-latest, mistral-medium, mistral-small, codestral, devstral, magistral |
| Google | gemini-2.5-flash, gemini-2.5-pro |
Vision / multimodal: Pass image_url directly in your request. URL mode only — no base64 encoding required.
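Because the gateway takes URL-mode images only, a multimodal message is just a content array mixing `image_url` and `text` parts. A sketch of building one (the part layout follows the vision curl example below; the helper name is ours):

```python
def vision_message(image_url: str, question: str) -> dict:
    """Build a single user message pairing an image URL with a text prompt.

    URL mode only: the gateway fetches the image itself, so the client
    never base64-encodes or uploads anything.
    """
    return {
        "role": "user",
        "content": [
            {"type": "image_url", "image_url": {"url": image_url}},
            {"type": "text", "text": question},
        ],
    }
```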
## Setup

### Claude Desktop
```json
{
  "mcpServers": {
    "lightningprox": {
      "command": "npx",
      "args": ["lightningprox-mcp"]
    }
  }
}
```

Config location:

- macOS: `~/Library/Application Support/Claude/claude_desktop_config.json`
- Windows: `%APPDATA%\Claude\claude_desktop_config.json`
- Linux: `~/.config/claude/claude_desktop_config.json`
### Claude Code

```
claude mcp add lightningprox -- npx lightningprox-mcp
```

## Tools
| Tool | Description |
|---|---|
| `chat` | Send a prompt to any model, authenticated via spend token |
| `chat` (vision) | Send a prompt with an image URL for multimodal analysis |
| `get_balance` | Check remaining sats on a spend token |
| `list_models` | List available models with per-call pricing |
| | Estimate cost in sats for a given model and token count |
| `generate_invoice` | Generate a Lightning invoice to top up a spend token |
## Spend Token Auth

Every request authenticates via the `X-Spend-Token` header:
```sh
curl -X POST https://lightningprox.com/v1/chat \
  -H "Content-Type: application/json" \
  -H "X-Spend-Token: lnpx_your_token_here" \
  -d '{
    "model": "claude-sonnet-4-6",
    "messages": [{"role": "user", "content": "What is the Lightning Network?"}]
  }'
```

For vision requests, include `image_url` in the message content (no base64 needed):
```sh
curl -X POST https://lightningprox.com/v1/chat \
  -H "Content-Type: application/json" \
  -H "X-Spend-Token: lnpx_your_token_here" \
  -d '{
    "model": "claude-sonnet-4-6",
    "messages": [{
      "role": "user",
      "content": [
        {"type": "image_url", "image_url": {"url": "https://example.com/chart.png"}},
        {"type": "text", "text": "Describe this chart"}
      ]
    }]
  }'
```

## Getting a Spend Token
1. Call `get_invoice` (or `ask_ai` without a token) to receive a Lightning invoice
2. Pay the invoice from any Lightning wallet
3. Your spend token is returned; use it for all subsequent requests until the balance runs out
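The steps above amount to a small create, pay, poll loop. A sketch with the two gateway calls injected as callables, so the flow reads independently of transport (the function names and the `bolt11`/`invoice_id`/`paid`/`spend_token` field names are our assumptions, not a documented schema):

```python
import time
from typing import Callable, Optional


def top_up(
    create_invoice: Callable[[int], dict],   # assumed: sats -> {"bolt11": ..., "invoice_id": ...}
    check_payment: Callable[[str], dict],    # assumed: id -> {"paid": bool, "spend_token": ...}
    amount_sats: int,
    poll_interval: float = 2.0,
    timeout: float = 600.0,
) -> Optional[str]:
    """Request an invoice, wait for it to be paid, return the spend token.

    The invoice is paid out-of-band from any Lightning wallet; this loop
    only polls until the gateway confirms payment or the timeout expires.
    """
    invoice = create_invoice(amount_sats)
    print("Pay this invoice from any Lightning wallet:", invoice["bolt11"])
    deadline = time.monotonic() + timeout
    while time.monotonic() < deadline:
        status = check_payment(invoice["invoice_id"])
        if status.get("paid"):
            # Use this token for all subsequent requests until the balance runs out.
            return status["spend_token"]
        time.sleep(poll_interval)
    return None  # invoice expired unpaid
```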
## Endpoints

| Endpoint | Description |
|---|---|
| `/v1/chat` | Chat completions, OpenAI-compatible format |
| | Anthropic messages format |
| | List available models with pricing |
| | Check spend token balance |
| | Generate Lightning invoice |
## Links

- Gateway: lightningprox.com
- Docs: lightningprox.com/docs
- AIProx agent registry: aiprox.dev

Built by LPX Digital Group LLC