MCP Streamable HTTP Demo
Allows LangChain agents to use MCP tools (calculate, text_stats) over Streamable HTTP for calculations and text analysis.
Integrates with LangGraph to enable agents to leverage MCP tools for arithmetic and text statistics.
Provides an n8n AI Agent setup using MCP over Streamable HTTP, allowing workflows to call the calculate and text_stats tools.
TP5 MCP Streamable HTTP Demo
This project recreates the demo objective:
- MCP server exposing two tools.
- Test with @modelcontextprotocol/inspector.
- LangChain/LangGraph-compatible agent using MCP over Streamable HTTP.
- n8n AI Agent setup using MCP over Streamable HTTP.
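As a reference for what the two tools compute, here is a hypothetical sketch of their core logic in plain JavaScript. The `calculate` argument shape (`operation`, `numbers`) mirrors the Inspector CLI call shown later in this README; the `text_stats` fields (`characters`, `words`) are assumptions, not the server's actual schema.

```javascript
// Hypothetical sketch of the two tools' core logic, NOT the server's
// actual implementation. The calculate argument shape matches the
// Inspector CLI example (operation=add, numbers=[2,3,4]); the
// text_stats fields are guesses.
function calculate({ operation, numbers }) {
  switch (operation) {
    case "add":
      return numbers.reduce((acc, n) => acc + n, 0);
    case "multiply":
      return numbers.reduce((acc, n) => acc * n, 1);
    case "subtract":
      return numbers.reduce((acc, n) => acc - n);
    case "divide":
      return numbers.reduce((acc, n) => acc / n);
    default:
      throw new Error(`unknown operation: ${operation}`);
  }
}

function textStats(text) {
  const words = text.trim().split(/\s+/).filter(Boolean);
  return { characters: text.length, words: words.length };
}
```

With these semantics, `calculate` with `operation=add` and `numbers=[2,3,4]` returns 9, and `textStats("hello world")` reports 2 words.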
Important security note
The OpenAI key pasted in the prompt was exposed in chat. Revoke it and create a new key before running the agent. Put the new key in .env; do not commit it.
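One way to keep the rotated key out of source control is to load it from .env at runtime. The project may well use the dotenv package for this; the hand-rolled parser below is only a sketch of the same idea:

```javascript
// Parse KEY=value pairs from .env-style text, skipping blank lines
// and # comments. Sketch only; the dotenv package covers more cases
// (quoting, export prefixes, multiline values).
function parseEnv(text) {
  const env = {};
  for (const rawLine of text.split(/\r?\n/)) {
    const line = rawLine.trim();
    if (!line || line.startsWith("#")) continue;
    const eq = line.indexOf("=");
    if (eq <= 0) continue;
    env[line.slice(0, eq).trim()] = line.slice(eq + 1).trim();
  }
  return env;
}
```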
Setup
```
Copy-Item .env.example .env
npm install
```

Edit .env and set:

```
OPENAI_API_KEY=your_new_key
```

Start the MCP server

```
npm run server
```

The MCP endpoint is: http://127.0.0.1:3000/mcp
The health endpoint is: http://127.0.0.1:3000/health

Test with MCP Inspector
In one terminal, keep the server running:

```
npm run server
```

In another terminal, list the tools with the Inspector CLI:

```
npm run inspector:list-tools
```

On Windows, the current Inspector CLI can print the correct JSON response and then exit with a Node/libuv assertion. If you see the tools JSON containing calculate and text_stats, the MCP call itself succeeded.

Call a tool with the Inspector CLI:

```
npx --yes @modelcontextprotocol/inspector --cli http://127.0.0.1:3000/mcp --transport http --method tools/call --tool-name calculate --tool-arg operation=add --tool-arg "numbers=[2,3,4]"
```

You can also open the Inspector UI:

```
npm run inspector
```

Then select:

- Transport: Streamable HTTP
- URL: http://127.0.0.1:3000/mcp

Local smoke test

```
npm run smoke
```

This lists the MCP tools and calls calculate.
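Under the hood, the Inspector and the smoke test POST JSON-RPC 2.0 bodies to the /mcp endpoint. Here is a sketch of the two request bodies; the method names (`tools/list`, `tools/call`) follow the MCP specification, the `id` values are arbitrary, and a real client first performs an `initialize` handshake, omitted here:

```javascript
// Build JSON-RPC 2.0 request bodies for the two MCP calls the smoke
// test makes. Method names come from the MCP specification; ids are
// arbitrary request identifiers.
function listToolsRequest(id) {
  return { jsonrpc: "2.0", id, method: "tools/list", params: {} };
}

function callToolRequest(id, name, args) {
  return {
    jsonrpc: "2.0",
    id,
    method: "tools/call",
    params: { name, arguments: args },
  };
}

// Equivalent to the Inspector CLI tools/call example:
const body = callToolRequest(2, "calculate", {
  operation: "add",
  numbers: [2, 3, 4],
});
```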
LangChain agent
Make sure the server is running and .env contains a valid rotated OPENAI_API_KEY.
```
npm run agent
```

Custom prompt:

```
npm run agent -- "Calculate 42 / 6 then analyze the text: Hello from MCP."
```

n8n agent
Follow docs/n8n-agent.md.
The n8n MCP Client Tool configuration is:
- Endpoint: http://127.0.0.1:3000/mcp
- Server Transport: HTTP Streamable
- Authentication: None
- Tools to Include: All