
# MCP Server/Client Example

The server is a modified implementation following a tutorial by Alex Merced on <a href="https://medium.com/data-engineering-with-dremio/building-a-basic-mcp-server-with-python-4c34c41031ed">Medium</a>. The custom client is an implementation of the Anthropic quickstart tutorial <a href="https://modelcontextprotocol.io/quickstart/client">Build an MCP Client</a>.

# MCP Server

Navigate to your project directory and run these commands in the terminal:

```
source .venv/bin/activate
uv --directory . run mcp_server/main.py
```

NOTE: The server prints nothing to the terminal; this is normal.

# MCP Client

## Claude Desktop

On macOS or Linux, add the following entry to <code>~/Library/Application Support/Claude/claude_desktop_config.json</code>. On Windows, add the following entry to <code>%APPDATA%\Claude\claude_desktop_config.json</code>:

```
{
  "mcpServers": {
    "mcp-mix-server": {
      "command": "uv",
      "args": [
        "--directory",
        "{ABSOLUTE_PATH}/mcp-mix-server",
        "run",
        "mcp_server/main.py"
      ]
    }
  }
}
```

### Verify server registry

1. Click on the "Searches and tools" option.

   <img src="./images/claude-desktop-mcp-server-registry.png" />

2. Click on "mcp-mix-server"; you should see the listed tools.

   <img src="./images/claude-desktop-mcp-tools.png" />

3. Test with the following queries:

   - "Summarize the CSV file named sample.csv."
   - "How many rows are in sample.parquet?"

## Custom MCP client

Alternatively, use the custom client in <code>mcp_client/</code>, implemented following the Anthropic quickstart tutorial <a href="https://modelcontextprotocol.io/quickstart/client">Build an MCP Client</a>.

Create a <code>.env</code> file in the root folder and put your Anthropic API access key in it. To obtain the API access key, log in to your Anthropic account and follow the instructions there.

```
ANTHROPIC_API_KEY=<your_api_access_key>
```

You can always use your own reasoning model, but this repo is a very basic demonstration of how an MCP server and client work, so we stick with Claude.

When the client starts, it automatically launches the server in <code>stdio</code> transport mode. This means the client accesses the server locally rather than remotely (remote access would use the SSE transport mode), so there is no need to run the server script separately.

To run the custom client:

```
uv run mcp_client/client.py mcp_server/main.py
```

Test with the following queries:

- "Summarize the CSV file named sample.csv."
- "How many rows are in sample.parquet?"
- or type "quit" to exit
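
The queries above exercise data tools exposed by <code>mcp_server/main.py</code>. For orientation, here is a minimal sketch of what such a server could look like, assuming the <code>FastMCP</code> helper from the official <code>mcp</code> Python SDK and <code>pandas</code> for reading the data files; the tool names and the <code>./data/</code> path are illustrative, not the repo's actual code.

```python
# Hypothetical sketch of an MCP server exposing CSV/Parquet tools.
# Assumes the official `mcp` Python SDK and `pandas` are installed;
# tool names and file paths are illustrative, not this repo's actual code.
import pandas as pd
from mcp.server.fastmcp import FastMCP

mcp = FastMCP("mcp-mix-server")

@mcp.tool()
def summarize_csv_file(filename: str) -> str:
    """Return row/column counts for a CSV file in the data folder."""
    df = pd.read_csv(f"./data/{filename}")
    return f"{filename} has {len(df)} rows and {len(df.columns)} columns."

@mcp.tool()
def summarize_parquet_file(filename: str) -> str:
    """Return row/column counts for a Parquet file in the data folder."""
    df = pd.read_parquet(f"./data/{filename}")
    return f"{filename} has {len(df)} rows and {len(df.columns)} columns."

if __name__ == "__main__":
    # stdio transport: the client launches this process and talks over stdin/stdout.
    mcp.run(transport="stdio")
```

With a server shaped like this, Claude can map a query such as "How many rows are in sample.parquet?" onto a tool call and return the result.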

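Likewise, the custom client connects over stdio and spawns the server process itself. The following is a simplified sketch of that wiring, assuming the <code>mcp</code> Python package from the quickstart; it is not the repo's actual <code>mcp_client/client.py</code>, which additionally forwards the listed tools to Claude via the Anthropic API.

```python
# Simplified sketch of a stdio MCP client, loosely following the Anthropic
# quickstart; illustrative only, not this repo's actual mcp_client/client.py.
import asyncio
import sys

from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client


async def main(server_script: str) -> None:
    # The client spawns the server itself, so no separate server process is needed.
    params = StdioServerParameters(command="uv", args=["run", server_script])
    async with stdio_client(params) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()
            tools = await session.list_tools()
            print("Available tools:", [tool.name for tool in tools.tools])
            # A full client would now pass these tools to Claude via the
            # Anthropic Messages API and relay any tool calls back to the session.


if __name__ == "__main__":
    asyncio.run(main(sys.argv[1]))
```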