WolframAlpha LLM MCP Server

by Garoth

A Model Context Protocol (MCP) server that provides access to WolframAlpha's LLM API (documentation: https://products.wolframalpha.com/llm-api/documentation).

Features

  • Query WolframAlpha's LLM API with natural language questions
  • Answer complicated mathematical questions
  • Query facts about science, physics, history, geography, and more
  • Get structured responses optimized for LLM consumption
  • Support for simplified answers and detailed responses with sections

Available Tools

  • ask_llm: Ask WolframAlpha a question and get a structured, LLM-friendly response
  • get_simple_answer: Get a simplified answer
  • validate_key: Validate the WolframAlpha API key
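
For example, an MCP client can spawn the server over stdio and call these tools. Below is a minimal sketch using the official TypeScript MCP SDK (@modelcontextprotocol/sdk); the ask_llm argument name ("query") and the sample question are illustrative assumptions, not taken from this repository, so check the schemas returned by listTools for the real parameter names.

  // client-example.ts: minimal sketch of driving this server from a TypeScript MCP client.
  import { Client } from "@modelcontextprotocol/sdk/client/index.js";
  import { StdioClientTransport } from "@modelcontextprotocol/sdk/client/stdio.js";

  // Spawn the built server the same way the Configuration section below registers it.
  const transport = new StdioClientTransport({
    command: "node",
    args: ["/path/to/wolframalpha-mcp-server/build/index.js"],
    env: { WOLFRAM_LLM_APP_ID: "your-api-key-here" },
  });

  const client = new Client({ name: "example-client", version: "0.1.0" }, { capabilities: {} });
  await client.connect(transport);

  // Discover the tools the server exposes (ask_llm, get_simple_answer, validate_key).
  const { tools } = await client.listTools();
  console.log(tools.map((t) => t.name));

  // Call ask_llm; the "query" argument name is an assumption, so verify it against the tool's input schema.
  const result = await client.callTool({
    name: "ask_llm",
    arguments: { query: "What is the integral of x^2 * sin(x)?" },
  });
  console.log(result.content);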

Installation

git clone https://github.com/Garoth/wolframalpha-llm-mcp.git
cd wolframalpha-llm-mcp
npm install

Configuration

  1. Get your WolframAlpha API key from developer.wolframalpha.com
  2. Add it to your Cline MCP settings file in VS Code (e.g. ~/.config/Code/User/globalStorage/saoudrizwan.claude-dev/settings/cline_mcp_settings.json):
{
  "mcpServers": {
    "wolframalpha": {
      "command": "node",
      "args": ["/path/to/wolframalpha-mcp-server/build/index.js"],
      "env": {
        "WOLFRAM_LLM_APP_ID": "your-api-key-here"
      },
      "disabled": false,
      "autoApprove": [
        "ask_llm",
        "get_simple_answer",
        "validate_key"
      ]
    }
  }
}

Development

Setting Up Tests

The tests use real API calls to ensure accurate responses. To run the tests:

  1. Copy the example environment file:
    cp .env.example .env
  2. Edit .env and add your WolframAlpha API key:
    WOLFRAM_LLM_APP_ID=your-api-key-here
    Note: The .env file is gitignored to prevent committing sensitive information.
  3. Run the tests:
    npm test
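
Before running the full suite, the key can also be smoke-tested directly against the LLM API. A minimal sketch, assuming Node 18+ (for the global fetch) and the endpoint given in the WolframAlpha LLM API documentation linked above; the sample input is arbitrary:

  // check-key.mts: quick sanity check of WOLFRAM_LLM_APP_ID against the WolframAlpha LLM API.
  const appId = process.env.WOLFRAM_LLM_APP_ID;
  if (!appId) throw new Error("WOLFRAM_LLM_APP_ID is not set");

  const url = new URL("https://www.wolframalpha.com/api/v1/llm-api");
  url.searchParams.set("appid", appId);
  url.searchParams.set("input", "What is 2 + 2?");

  const res = await fetch(url);
  console.log(res.ok ? await res.text() : `Key check failed: HTTP ${res.status}`);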

Building

npm run build
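
After building, the server can be started directly for a quick manual check, assuming the API key is set in the environment as in the Configuration section above:

WOLFRAM_LLM_APP_ID=your-api-key-here node build/index.js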

License

MIT


Remote-capable server

The server can be hosted and run remotely because it relies primarily on a remote service (the WolframAlpha LLM API) rather than on the local environment.


