WolframAlpha LLM MCP Server
A Model Context Protocol (MCP) server that provides access to WolframAlpha's LLM API. https://products.wolframalpha.com/llm-api/documentation
Features
- Query WolframAlpha's LLM API with natural language questions
- Answer complicated mathematical questions
- Query facts about science, physics, history, geography, and more
- Get structured responses optimized for LLM consumption
- Support for simplified answers and detailed responses with sections
Available Tools
- `ask_llm`: Ask WolframAlpha a question and get a structured, LLM-friendly response
- `get_simple_answer`: Get a simplified answer
- `validate_key`: Validate the WolframAlpha API key
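For context, an MCP client invokes these tools with a JSON-RPC `tools/call` request. A sketch of what a call to `ask_llm` might look like on the wire; the `query` argument name is an assumption, not confirmed by this README:

```json
{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "tools/call",
  "params": {
    "name": "ask_llm",
    "arguments": { "query": "What is the derivative of x^3?" }
  }
}
```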
Installation
Configuration
- Get your WolframAlpha API key from developer.wolframalpha.com
- Add it to your Cline MCP settings file in VS Code (e.g. `~/.config/Code/User/globalStorage/saoudrizwan.claude-dev/settings/cline_mcp_settings.json`):
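A minimal settings entry might look like the following sketch; the server name, install path, and the `WOLFRAM_API_KEY` environment-variable name are illustrative assumptions, not taken from this README:

```json
{
  "mcpServers": {
    "wolframalpha": {
      "command": "node",
      "args": ["/path/to/wolframalpha-llm-mcp/build/index.js"],
      "env": {
        "WOLFRAM_API_KEY": "your-api-key"
      }
    }
  }
}
```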
Development
Setting Up Tests
The tests use real API calls to ensure accurate responses. To run the tests:
- Copy the example environment file.
- Edit `.env` and add your WolframAlpha API key. Note: the `.env` file is gitignored to prevent committing sensitive information.
- Run the tests.
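Assuming a standard Node.js layout with an `.env.example` template and an npm `test` script (both assumptions; check the repository), the steps above might look like:

```shell
cp .env.example .env    # copy the template
# edit .env and add your WolframAlpha API key
npm test                # run the test suite
```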
Building
License
MIT