mcp-server-demo
Click on "Install Server".
Wait a few minutes for the server to deploy. Once ready, it will show a "Started" state.
In the chat, type @ followed by the MCP server name and your instructions, e.g., "@mcp-server-demo what's the weather in Tokyo?"
That's it! The server will respond to your query, and you can continue using it as needed.
MCP Server Demo (Python)
Quickstart (local run)
Create and activate a virtual environment.
python -m venv .venv
source .venv/bin/activate

Install Python dependencies.
pip install -U pip
pip install -r requirements.txt
pip install -e .

Set up environment variables.
cp .env.example .env

Fill in your OPENAI_API_KEY and keep MCP_DEMO_DB_PATH=demo.db. If you want to use the file tools, set MCP_FILE_OPS_ROOT to a local folder path that the server is allowed to manage.
You will fill in MCP_SERVER_URL in a later step.
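For reference, a filled-in .env might look like the following (all values are illustrative placeholders; MCP_SERVER_URL stays commented out until ngrok is running):

```
OPENAI_API_KEY=<your OpenAI API key>
MCP_DEMO_DB_PATH=demo.db
MCP_FILE_OPS_ROOT=/home/you/mcp-files
# Filled in after starting ngrok (later step):
# MCP_SERVER_URL=https://<your-subdomain>.ngrok-free.app/mcp
```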
If you plan to expose your local MCP server publicly, make sure ngrok is installed:

sudo snap install ngrok

and that your account is set up and authenticated first. You can set up an account for free. Once set up, add your authtoken using the terminal:

ngrok config add-authtoken <AUTHTOKEN>

You can find the token at https://dashboard.ngrok.com/get-started/your-authtoken

Start the MCP server (HTTP transport).
MCP_TRANSPORT=streamable-http MCP_HOST=0.0.0.0 MCP_PORT=8000 MCP_PATH=/mcp mcp-server-demo

In a new terminal, start ngrok.

ngrok http 8000

Once ngrok is running, update MCP_SERVER_URL in your .env file: use the address shown under "Forwarding". Your .env will now contain MCP_SERVER_URL=<Full forwarding Address>/mcp
(Optional) In another terminal, run MCP Inspector.
npx @modelcontextprotocol/inspector

It should open a web browser. In the left-hand panel, update Command to mcp-server-demo and press Connect. You can select "Tools" in the top menu to test the available tools, and see history and notifications at the bottom of the page.
(Optional) Launch the Streamlit client.
streamlit run web_client.py

A from-scratch local MCP server with tools for weather, SQLite reads, and local file operations:

weather(city) → current weather via wttr.in
query_db(sql) → read-only SQLite SELECT queries
make_directory(path) → create directories inside MCP_FILE_OPS_ROOT
move_file(source_path, destination_path) → move files inside MCP_FILE_OPS_ROOT
move_files_by_glob(source_dir, pattern, destination_dir) → move many files in one call (e.g., *.txt)
list_files(path=".") → list files in a folder inside MCP_FILE_OPS_ROOT
list_directories(path=".") → list directories in a folder inside MCP_FILE_OPS_ROOT
read_file(path) → read text files inside MCP_FILE_OPS_ROOT
inspect_file(path, preview_chars=4000, include_base64=False) → metadata + preview for text/csv/image files
analyze_image_with_openai(path, prompt, model='gpt-4.1-mini') → send an image to an OpenAI vision-capable model
Notes
Default local MCP endpoint is:
http://127.0.0.1:8000/mcp

The server creates demo.db automatically with sample rows.
npx requires Node.js/npm installed locally.
streamlit is included in requirements.txt.
OpenAI API integration option
Ensure .env includes your key and MCP server URL. Start the server in HTTP mode, then run:

python client_openai_api.py

Tool behavior
weather(city: str)
Returns JSON summary fields including temperature, feels-like, humidity, wind, and short conditions.
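For illustration, those summary fields can be pulled out of wttr.in's `?format=j1` JSON roughly as follows; the sample payload is trimmed to the relevant keys, and the field names in the demo's actual response may differ:

```python
import json

# Trimmed example of wttr.in's "?format=j1" JSON (only the keys used here).
sample = json.loads("""
{"current_condition": [{"temp_C": "21", "FeelsLikeC": "20",
  "humidity": "60", "windspeedKmph": "12",
  "weatherDesc": [{"value": "Partly cloudy"}]}]}
""")

def summarize(payload: dict) -> dict:
    # Extract the summary fields the weather tool reports.
    cur = payload["current_condition"][0]
    return {
        "temperature_c": cur["temp_C"],
        "feels_like_c": cur["FeelsLikeC"],
        "humidity_pct": cur["humidity"],
        "wind_kmph": cur["windspeedKmph"],
        "conditions": cur["weatherDesc"][0]["value"],
    }

print(summarize(sample)["conditions"])  # Partly cloudy
```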
query_db(sql: str)
Allows only SELECT ... queries. Returns rows as JSON.
Rejects non-SELECT SQL for safety in this starter demo.
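A minimal sketch of such a guard, shown here against an in-memory SQLite table; the demo's actual check in query_db may differ:

```python
import sqlite3

def run_select_only(conn: sqlite3.Connection, sql: str) -> list[dict]:
    # Reject anything that does not start with SELECT, mirroring the
    # read-only policy described above (a sketch, not the demo's exact code).
    if not sql.lstrip().lower().startswith("select"):
        raise ValueError("only SELECT queries are allowed")
    cur = conn.execute(sql)
    cols = [d[0] for d in cur.description]
    return [dict(zip(cols, row)) for row in cur.fetchall()]

# Demo against an in-memory database with one sample row.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (id INTEGER, name TEXT)")
conn.execute("INSERT INTO users VALUES (1, 'Ada')")
print(run_select_only(conn, "SELECT name FROM users"))  # [{'name': 'Ada'}]
```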
Project files
server.py — FastMCP server + tool definitions.
client_openai_api.py — simple OpenAI API call that can invoke MCP tools.
web_client.py — Streamlit chat client.
pyproject.toml — package metadata + script entrypoint.
requirements.txt — pinned runtime dependencies for local setup.
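As a hedged sketch of what client_openai_api.py likely does, the running server can be handed to OpenAI as a remote MCP tool. The tool-entry shape below follows the OpenAI Responses API; the server_label and model name are assumptions, not taken from the demo's code:

```python
import os

def build_mcp_tool(server_url: str) -> dict:
    # Remote MCP tool entry in the shape the OpenAI Responses API accepts.
    return {
        "type": "mcp",
        "server_label": "mcp-server-demo",  # assumed label
        "server_url": server_url,
        "require_approval": "never",
    }

tool = build_mcp_tool(os.environ.get("MCP_SERVER_URL", "http://127.0.0.1:8000/mcp"))

# With the openai package installed and OPENAI_API_KEY set, the call
# would look roughly like:
# from openai import OpenAI
# resp = OpenAI().responses.create(
#     model="gpt-4.1-mini",
#     tools=[tool],
#     input="what's the weather in Tokyo?",
# )
```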
File operation tools
All file operations are constrained to MCP_FILE_OPS_ROOT. The server rejects paths that try to escape that root.
MCP_FILE_OPS_ROOT directories are created automatically if they do not exist.
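The escape check can be sketched with pathlib; this is a plausible shape of the guard, not necessarily the server's exact code, and the root here is a temporary stand-in for MCP_FILE_OPS_ROOT:

```python
import tempfile
from pathlib import Path

# Stand-in for MCP_FILE_OPS_ROOT.
ROOT = Path(tempfile.mkdtemp()).resolve()

def resolve_inside_root(user_path: str) -> Path:
    # Resolve the requested path and refuse anything outside ROOT,
    # which also defeats ".." tricks after relative-path resolution.
    candidate = (ROOT / user_path).resolve()
    if candidate != ROOT and ROOT not in candidate.parents:
        raise ValueError(f"path escapes MCP_FILE_OPS_ROOT: {user_path}")
    return candidate
```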