MCP_WolframAlpha

MIT License
MCP Wolfram Alpha (Server + Client)

Seamlessly integrate Wolfram Alpha into your chat applications.

This project implements an MCP (Model Context Protocol) server designed to interface with the Wolfram Alpha API. It enables chat-based applications to perform computational queries and retrieve structured knowledge, facilitating advanced conversational capabilities.

Included is an MCP-Client example utilizing Gemini via LangChain, demonstrating how to connect large language models to the MCP server for real-time interactions with Wolfram Alpha’s knowledge engine.
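Behind the MCP interface, the server's computational queries ultimately reach the public Wolfram|Alpha v2 API. As an illustration of what such a request looks like (a sketch of the underlying HTTP call, not this project's actual code), a query URL can be built like this:

```python
from urllib.parse import urlencode

WOLFRAM_API = "https://api.wolframalpha.com/v2/query"

def build_query_url(appid: str, query: str) -> str:
    """Construct a Wolfram|Alpha v2 query URL.

    `appid` is the AppID from the Wolfram developer portal;
    `output=json` requests a structured JSON result instead of XML.
    """
    params = urlencode({"appid": appid, "input": query, "output": "json"})
    return f"{WOLFRAM_API}?{params}"
```

Fetching that URL (e.g. with `requests.get`) returns the structured pods the server exposes to chat clients.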


Features

  • Wolfram|Alpha Integration for math, science, and data queries.
  • Modular Architecture: easily extendable to support additional APIs and functionalities.
  • Multi-Client Support: seamlessly handles interactions from multiple clients or interfaces.
  • MCP-Client example using Gemini (via LangChain).
  • UI Support: a Gradio-based web interface for interacting with Google AI and the Wolfram Alpha MCP server.

Installation

Clone the Repo

git clone https://github.com/ricocf/mcp-wolframalpha.git
cd mcp-wolframalpha

Set Up Environment Variables

Create a .env file based on the example:

  • WOLFRAM_API_KEY=your_wolframalpha_appid
  • GeminiAPI=your_google_gemini_api_key (required only when using the client examples below)
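The server and client presumably read these values from the environment at startup. A minimal sketch of that pattern (the variable names come from the example above; the exact loading code in this project is an assumption):

```python
import os

def load_config():
    """Read API credentials from environment variables.

    WOLFRAM_API_KEY is always required; GeminiAPI is only needed
    when running the LLM client.
    """
    appid = os.environ.get("WOLFRAM_API_KEY")
    if not appid:
        raise RuntimeError("WOLFRAM_API_KEY is not set")
    gemini_key = os.environ.get("GeminiAPI")  # may be None for server-only use
    return appid, gemini_key
```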

Install Requirements

pip install -r requirements.txt

Configuration

To use with the VSCode MCP Server:

  1. Create a configuration file at .vscode/mcp.json in your project root.
  2. Use the example provided in configs/vscode_mcp.json as a template.
  3. For more details, refer to the VSCode MCP Server Guide.
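A minimal `.vscode/mcp.json` might look like the following. This is a sketch modeled on the Claude Desktop example in the next section; `configs/vscode_mcp.json` in the repository is the authoritative template, and the key names here are an assumption:

```json
{
  "servers": {
    "WolframAlphaServer": {
      "command": "python3",
      "args": ["/path/to/src/core/server.py"]
    }
  }
}
```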

To use with Claude Desktop:

{
  "mcpServers": {
    "WolframAlphaServer": {
      "command": "python3",
      "args": ["/path/to/src/core/server.py"]
    }
  }
}

Client Usage Example

This project includes an LLM client that communicates with the MCP server.

Run with Gradio UI
  • Required: GeminiAPI
  • Provides a local web interface to interact with Google AI and Wolfram Alpha.
  • To launch the web interface from the command line:
python main.py --ui
Docker

To build and run the client inside a Docker container:

docker build -t wolframalphaui -f .devops/ui.Dockerfile .
docker run wolframalphaui

Run as CLI Tool
  • Required: GeminiAPI
  • To run the client directly from the command line:
python main.py
Docker

To build and run the client inside a Docker container:

docker build -t wolframalpha -f .devops/llm.Dockerfile .
docker run -it wolframalpha
