# WolframAlpha LLM MCP Server
*by Garoth*
<img src="assets/wolfram-llm-logo.png" width="256" alt="WolframAlpha LLM MCP Logo" />

A Model Context Protocol (MCP) server that provides access to [WolframAlpha's LLM API](https://products.wolframalpha.com/llm-api/documentation).
<div>
  <img src="assets/readme-screen-1.png" width="609" alt="WolframAlpha MCP Server Example 1" /><br/><br/>
  <img src="assets/readme-screen-2.png" width="609" alt="WolframAlpha MCP Server Example 2" />
</div>

## Features
- Query WolframAlpha's LLM API with natural language questions
- Answer complicated mathematical questions
- Query facts about science, physics, history, geography, and more
- Get structured responses optimized for LLM consumption
- Support for simplified answers and detailed responses with sections
## Available Tools

- `ask_llm`: Ask WolframAlpha a question and get a structured, LLM-friendly response
- `get_simple_answer`: Get a simplified answer
- `validate_key`: Validate the WolframAlpha API key
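For illustration, an MCP client would invoke one of these tools with a JSON-RPC `tools/call` request along these lines (the `query` argument name is an assumption; check the server's published tool schema for the exact parameter names):

```json
{
  "method": "tools/call",
  "params": {
    "name": "ask_llm",
    "arguments": {
      "query": "What is the integral of x^2 from 0 to 3?"
    }
  }
}
```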
## Installation
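The original installation snippet did not survive extraction; a typical setup for a Node-based MCP server looks like the following (the repository URL is an assumption based on the author's name):

```shell
# Clone the repository (URL is an assumption) and install dependencies
git clone https://github.com/Garoth/wolframalpha-llm-mcp.git
cd wolframalpha-llm-mcp
npm install
```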
## Configuration
- Get your WolframAlpha API key from [developer.wolframalpha.com](https://developer.wolframalpha.com)
- Add it to your Cline MCP settings file inside VSCode's settings (e.g. `~/.config/Code/User/globalStorage/saoudrizwan.claude-dev/settings/cline_mcp_settings.json`):
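The original settings snippet was lost in extraction; a sketch in the standard Cline MCP settings format would look like this (the server name, build path, and `WOLFRAM_LLM_APP_ID` environment variable name are assumptions — adjust them to match the server's actual configuration):

```json
{
  "mcpServers": {
    "wolframalpha": {
      "command": "node",
      "args": ["/path/to/wolframalpha-llm-mcp/build/index.js"],
      "env": {
        "WOLFRAM_LLM_APP_ID": "your-api-key-here"
      }
    }
  }
}
```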
## Development

### Setting Up Tests

The tests use real API calls to ensure accurate responses. To run the tests:

- Copy the example environment file
- Edit `.env` and add your WolframAlpha API key (the `.env` file is gitignored to prevent committing sensitive information)
- Run the tests
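The command snippets for these steps were lost in extraction; they would typically look like the following (the example filename, environment variable name, and npm script name are assumptions):

```shell
# Copy the example environment file (filename is an assumption)
cp .env.example .env

# Add your WolframAlpha API key (variable name is an assumption)
echo 'WOLFRAM_LLM_APP_ID=your-api-key-here' >> .env

# Run the tests (assumes a standard npm "test" script)
npm test
```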
## Building
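The build command was also lost in extraction; for a TypeScript MCP server it is typically (script name is an assumption):

```shell
# Compile the sources (assumes a standard npm "build" script)
npm run build
```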
## License
MIT