# 🧠 Demo MCP Server with Smithery
This repository showcases a demo implementation of a **Model Context Protocol (MCP)** server using **Smithery**, designed to help AI agents interact with tools securely and efficiently.
It includes a streamable MCP server that generates a simple Python "Hello World" program and is ready for integration with LibreChat.
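As a quick orientation, here is a minimal sketch of what such a tool could look like when built on the official MCP TypeScript SDK (`@modelcontextprotocol/sdk`). It is not the repository's actual source; the tool name `generate_hello_world`, its parameter, and the stdio transport are illustrative assumptions.

```typescript
// Illustrative sketch only -- not the repository's actual code.
// Assumes: npm install @modelcontextprotocol/sdk zod
import { McpServer } from "@modelcontextprotocol/sdk/server/mcp.js";
import { StdioServerTransport } from "@modelcontextprotocol/sdk/server/stdio.js";
import { z } from "zod";

const server = new McpServer({ name: "demo-mcp-server", version: "1.0.0" });

// Hypothetical tool that returns a simple Python "Hello World" program as text.
server.tool(
  "generate_hello_world",
  "Generate a simple Python Hello World program",
  { name: z.string().optional().describe("Optional name to greet") },
  async ({ name }) => ({
    content: [
      { type: "text", text: `print("Hello, ${name ?? "World"}!")` },
    ],
  })
);

// Serve over stdio; a Smithery deployment would typically expose this as a streamable endpoint.
const transport = new StdioServerTransport();
await server.connect(transport);
```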
---
## 📦 Project Structure

```
Demo-MCP-Server-Smithery/
├── safe-mcp-server/        # Secure MCP server using Smithery CLI
├── insecure-python-code/   # Example of insecure code for contrast
├── smithery.yaml           # Smithery configuration file
└── README.md               # You're reading it!
```

---
## 🚀 Features
- **Secure MCP Server**: Built with Smithery CLI to ensure safe tool interactions.
- **Insecure Code Samples**: Included for educational contrast and security awareness.
- **LibreChat Integration**: Ready to plug into LibreChat for AI agent communication.
- **Streamable Output**: Generates and streams a basic Python script.
---
## 🛠️ Requirements
- Node.js and npm
- Smithery CLI installed
- LibreChat (optional for integration)
---
## 🧪 Getting Started
1. Clone the repository:

   ```bash
   git clone https://github.com/Auxin-io/Demo-MCP-Server-Smithery.git
   cd Demo-MCP-Server-Smithery
   ```

2. Install dependencies (if applicable).

3. Run the secure MCP server:

   ```bash
   cd safe-mcp-server
   smithery run smithery.yaml
   ```
4. Test the output and explore the insecure code for comparison. For a programmatic smoke test, see the client sketch below.
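A minimal sketch of such a smoke test, using the official MCP TypeScript SDK (`@modelcontextprotocol/sdk`) over stdio, is shown below. The launch command, working directory, and the `generate_hello_world` tool name are assumptions for illustration only; adjust them to match how the server is actually started and which tools it exposes.

```typescript
// Illustrative MCP client sketch -- not part of this repository.
// Assumptions: the server starts via "smithery run smithery.yaml" from
// safe-mcp-server/ and exposes a tool named "generate_hello_world".
import { Client } from "@modelcontextprotocol/sdk/client/index.js";
import { StdioClientTransport } from "@modelcontextprotocol/sdk/client/stdio.js";

// Spawn the server as a child process and talk to it over stdio.
const transport = new StdioClientTransport({
  command: "smithery",
  args: ["run", "smithery.yaml"],
  cwd: "safe-mcp-server",
});

const client = new Client({ name: "demo-smoke-test", version: "1.0.0" });
await client.connect(transport);

// Discover which tools the server advertises.
const { tools } = await client.listTools();
console.log("Tools:", tools.map((t) => t.name));

// Call the (hypothetical) hello-world tool and print the raw result.
const result = await client.callTool({ name: "generate_hello_world", arguments: {} });
console.log(JSON.stringify(result, null, 2));

await client.close();
```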
---

## 🧑‍💻 Contributing

Pull requests are welcome! Feel free to fork the repo and submit improvements or new features.

---

## 📄 License

This project is licensed under the MIT License.

---

## ✨ Acknowledgments

Learn more:

- Smithery Documentation
- LibreChat GitHub
---

## MCP directory API

We provide all the information about MCP servers via our MCP API:

```bash
curl -X GET 'https://glama.ai/api/mcp/v1/servers/Auxin-io/Demo-Secure-MCP'
```

If you have feedback or need assistance with the MCP directory API, please join our Discord server.