# mcp
A demo of the ModelContextProtocol (MCP) using the Anthropic AI SDK.
Built following the Server Quickstart and the Client Quickstart.
## Getting Started

### Prerequisites
- Bun (v1.2.17)
- asdf (optional)
- asdf-bun (optional)
- Claude Desktop (required if only running the server)
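If you use asdf, the pinned Bun version presumably comes from a `.tool-versions` file at the project root, along these lines (the file contents here are an assumption based on the version listed above):

```
bun 1.2.17
```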
### Installation

If using asdf, run:

```sh
asdf install
```

Whether or not you use asdf, install dependencies:
```sh
bun install
```
## Server
The server allows a client to access resources, tools, and prompts.
It needs a client to interact with an LLM. Claude Desktop serves as the client if you are only running the server.
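Following the Server Quickstart, a weather tool typically fetches forecast periods from the National Weather Service API and formats them as text before returning them to the client. A self-contained sketch of that formatting step (the type and function names here are illustrative, not taken from this repo):

```typescript
// Shape of a forecast period as returned by the NWS API (subset of fields).
interface ForecastPeriod {
  name: string;            // e.g. "Saturday Night"
  temperature: number;     // e.g. 62
  temperatureUnit: string; // e.g. "F"
  shortForecast: string;   // e.g. "Partly Cloudy"
}

// Format periods into the plain-text block an MCP tool result would contain.
function formatForecast(periods: ForecastPeriod[]): string {
  return periods
    .map((p) => `${p.name}: ${p.temperature}°${p.temperatureUnit}, ${p.shortForecast}`)
    .join("\n");
}
```

The actual tool registration lives in `src/server/index.ts` and goes through the MCP SDK; this only sketches the data-shaping part.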
### Set Up
You need to add the server to Claude Desktop by modifying the `claude_desktop_config.json` file in your `Library/Application Support/Claude` directory. If this file does not exist, you can create it.
```sh
vi ~/.config/Claude/claude_desktop_config.json
```

Add the following to the file:
```json
{
  "mcpServers": {
    "weather": {
      "command": "/ABSOLUTE/PATH/TO/bin/bun",
      "args": ["/ABSOLUTE/PATH/TO/src/server/index.ts"]
    }
  }
}
```

**⚠️ Bun Path**

If you are using asdf, you will need to use the absolute path to the bun executable. You can find this by running `asdf where bun`.
If you're just using bun without asdf, you can use `bun` as the command.
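For example, with asdf the resolved config might look like the following (the user and project paths here are hypothetical; substitute your own):

```json
{
  "mcpServers": {
    "weather": {
      "command": "/Users/you/.asdf/installs/bun/1.2.17/bin/bun",
      "args": ["/Users/you/mcp/src/server/index.ts"]
    }
  }
}
```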
### Run
Once you've modified the claude_desktop_config.json file, restart Claude Desktop.
You should now see the weather tools and prompts in Claude Desktop!
## Client
Instead of using Claude Desktop, you can also run a client to handle the interaction with the LLM.
This would be suitable for building a chat interface or web application that uses Anthropic's API. With MCP, you can give the LLM access to data without having to manually copy and paste it into a prompt.
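Per the Client Quickstart, the client lists the server's tools over MCP and passes them to Anthropic's messages API; the main adaptation is renaming `inputSchema` to `input_schema`. A sketch of that mapping (types simplified, function name illustrative):

```typescript
// MCP advertises tools with camelCase inputSchema; Anthropic's API expects snake_case.
type McpTool = { name: string; description?: string; inputSchema: object };
type AnthropicTool = { name: string; description: string; input_schema: object };

function toAnthropicTools(tools: McpTool[]): AnthropicTool[] {
  return tools.map((t) => ({
    name: t.name,
    description: t.description ?? "",
    input_schema: t.inputSchema,
  }));
}
```

The result can be passed as the `tools` parameter of a `messages.create` call; when the model returns a `tool_use` block, the client forwards the call back to the MCP server.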
### Set Up
Get an Anthropic API key from the Anthropic API Keys page.
Create a `.env` file in the root of the project and add the following:

```
ANTHROPIC_API_KEY=your_api_key_here
```

### Run
Now you can run the client:
```sh
bun run dev
```

This gives you an interactive CLI where you can ask the LLM questions. Note that you have access to the tools defined in the server!
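Bun loads `.env` automatically, so the key is available on `process.env` when `bun run dev` starts. A minimal sketch of the kind of guard a client might run before constructing the Anthropic client (the function name is illustrative, not from this repo):

```typescript
// Fail fast with a clear message if the API key was not configured in .env.
function requireApiKey(env: Record<string, string | undefined>): string {
  const key = env.ANTHROPIC_API_KEY;
  if (!key) {
    throw new Error("ANTHROPIC_API_KEY is not set; add it to your .env file");
  }
  return key;
}
```

Called as `requireApiKey(process.env)`, this returns the key or throws before any API traffic is attempted.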