# 🧠 one-mcp

## 📘 Overview
one-mcp is a lightweight MCP (Model Context Protocol) server built using FastAPI that enables intelligent tool management and semantic search for APIs.
It allows you to upload, manage, and query API tools using natural language, powered by modern embedding models via sentence-transformers.
The server supports multiple transport modes (stdio, HTTP, or both) and provides both a REST API and MCP tool interface for maximum flexibility.
## ✨ Features
- 🔍 **Semantic Search**: Find relevant API tools from descriptive queries using sentence-transformers embeddings.
- 📤 **Upload Tools**: Add new API tools via JSON body or file upload.
- 🗑️ **Delete Tools**: Remove specific tools by name (supports batch deletion).
- 🧾 **Tool Statistics**: Get insights on stored tools, including count, model, and storage path.
- 🧹 **Tool Management**: Clear, inspect, or modify your tool store easily.
- ⚡ **FastAPI Backend**: High-performance, async-ready backend server.
- 🤖 **MCP Compatibility**: Dual interface (REST API and MCP tools) for seamless integration.
- 🔄 **Dual Transport**: Supports stdio and HTTP transports simultaneously.
- 💾 **Persistent Storage**: Tools and embeddings saved to disk with automatic loading.
- 📝 **Structured Logging**: Comprehensive logging with rotating file handlers.
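At its core, semantic search scores a query embedding against every stored tool embedding and returns the closest matches. A minimal sketch of the idea, using toy 3-dimensional vectors in place of real sentence-transformers embeddings (`rank_tools` is an illustrative helper, not the server's actual code):

```python
import math

def cosine_similarity(a, b):
    # Cosine similarity between two equal-length vectors.
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)

def rank_tools(query_vec, tool_vecs, tool_names, k=3):
    # Score every stored tool embedding against the query, highest first.
    scores = [cosine_similarity(query_vec, v) for v in tool_vecs]
    order = sorted(range(len(scores)), key=lambda i: scores[i], reverse=True)
    return [(tool_names[i], scores[i]) for i in order[:k]]

# Toy "embeddings" standing in for sentence-transformers output.
names = ["get_weather", "get_stock_price", "send_email"]
vecs = [[1.0, 0.1, 0.0], [0.1, 1.0, 0.0], [0.0, 0.0, 1.0]]
query = [0.9, 0.2, 0.0]  # a query semantically close to "weather"

print(rank_tools(query, vecs, names, k=2))
```

In the real server the vectors are high-dimensional model outputs, but the ranking step works the same way.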
## 🧩 Project Structure

```
one-mcp/
├── server.py            # Main application entry point with server orchestration
├── mcp_server.py        # MCP server class with multi-transport support
├── api.py               # FastAPI routes and REST endpoints
├── mcp_tools.py         # MCP tool definitions and handlers
├── models.py            # Pydantic models for request/response validation
├── tools_store.py       # Persistent tool storage with embeddings
├── config.py            # Server configuration and argument parsing
├── logging_setup.py     # Centralized logging configuration
├── test_specs.json      # Sample tool dataset for testing
├── CURLS.md             # Example cURL commands for testing API endpoints
├── MCP_TOOLS.md         # MCP tools documentation
├── requirements.txt     # Project dependencies
├── Dockerfile           # Docker containerization (CPU-based dependencies)
└── README.md            # Project documentation (this file)
```

## ⚙️ Installation
### 1. Clone the Repository

```bash
git clone https://github.com/freakynit/one-mcp.git
cd one-mcp
```

### 2. Set Up a Virtual Environment

```bash
python -m venv venv
source venv/bin/activate   # macOS/Linux
venv\Scripts\activate      # Windows
```

### 3. Install Dependencies

```bash
pip install -r requirements.txt
```

Dependencies include:
```
fastapi>=0.104.0
uvicorn>=0.24.0
fastmcp>=0.2.0
python-multipart>=0.0.6
torch==2.4.1
torchvision==0.19.1
torchaudio==2.4.1
sentence-transformers>=2.2.0
scikit-learn>=1.3.0
numpy>=1.24.0
```

## 🔧 Running the Server
Note: The first time you run the server, it will download the `all-MiniLM-L6-v2` model from sentence-transformers. This may take a few seconds depending on your internet connection.
### Start with Dual Transport (stdio + HTTP)

```bash
python server.py --transport stdio,http --port 8003
```

This enables both MCP stdio communication and HTTP REST API access.

### HTTP-only Mode

```bash
python server.py --transport http --port 8003
```

### Stdio-only Mode (for MCP clients)

```bash
python server.py --transport stdio
```

### Using Uvicorn Directly

```bash
uvicorn server:app --host 0.0.0.0 --port 8003
```

### Configuration Options

- `--transport`: Transport mode (`stdio`, `http`, or `stdio,http`). Default: `stdio`
- `--port`: HTTP port number. Default: `8000`
- `--host`: Host to bind to. Default: `0.0.0.0`
- `--storage_path`: Path to store tool embeddings. Default: `tool_embeddings.json`
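The flags above could be parsed with a standard `argparse` setup along these lines. This is an illustrative sketch mirroring the documented flags and defaults; the actual `config.py` may differ in detail:

```python
import argparse

def parse_args(argv=None):
    # Mirrors the documented CLI flags; not the project's actual config.py.
    parser = argparse.ArgumentParser(description="one-mcp server")
    parser.add_argument("--transport", default="stdio",
                        help="stdio, http, or stdio,http")
    parser.add_argument("--port", type=int, default=8000)
    parser.add_argument("--host", default="0.0.0.0")
    parser.add_argument("--storage_path", default="tool_embeddings.json")
    args = parser.parse_args(argv)
    # "stdio,http" enables both transports at once.
    args.transports = args.transport.split(",")
    return args

args = parse_args(["--transport", "stdio,http", "--port", "8003"])
print(args.transports, args.port)
```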
By default, the server starts at:

👉 http://localhost:8003 (when HTTP transport is enabled)

The server automatically:

- Creates a `logs/` directory for application logs
- Loads existing tools from `tool_embeddings.json` on startup
- Saves tools to disk after any modification
## 🧪 Testing the API
The server provides two interfaces:

- REST API: available at `/api/*` endpoints (see CURLS.md for examples)
- MCP Tools: available via the MCP protocol (see MCP_TOOLS.md for documentation)
### REST API Endpoints
All endpoints return structured JSON responses with appropriate status codes.
#### Check Server Status

```bash
curl http://localhost:8003/api/status
```

#### Upload Tools via JSON

```bash
curl -X POST http://localhost:8003/api/tools/upload-json \
  -H "Content-Type: application/json" \
  -d '{"tools": [{"type": "function", "name": "get_weather", "description": "Get the current weather for a specific city.", "parameters": {"type": "object", "properties": {"city": {"type": "string", "description": "The name of the city to get weather for."}}}}]}'
```

#### Upload Tools via File

```bash
curl -X POST http://localhost:8003/api/tools/upload-file \
  -F "file=@test_tools.json;type=application/json"
```

#### Search for Similar Tools

```bash
curl -X POST http://localhost:8003/api/tools/search \
  -H "Content-Type: application/json" \
  -d '{"query": "weather forecast for a city", "k": 3}'
```

#### Get Statistics

```bash
curl http://localhost:8003/api/tools/stats
```

#### Delete Specific Tools

```bash
curl -X DELETE http://localhost:8003/api/tools/delete \
  -H "Content-Type: application/json" \
  -d '{"tool_names": ["get_weather", "get_news_headlines"]}'
```

#### Clear All Tools

```bash
curl -X DELETE http://localhost:8003/api/tools/clear
```

### MCP Access

The MCP endpoint is mounted at `/mcp` for HTTP streaming mode:

```bash
curl http://localhost:8003/mcp
```

For full MCP tool documentation, see MCP_TOOLS.md.

For more comprehensive testing examples, see CURLS.md.
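The search endpoint can also be exercised from Python. A small sketch that assembles the same request as the curl example using only the standard library (`build_search_request` is a hypothetical helper for illustration):

```python
import json
import urllib.request

def build_search_request(base_url, query, k=3):
    # Assemble the same POST the curl example sends to /api/tools/search.
    body = json.dumps({"query": query, "k": k}).encode("utf-8")
    return urllib.request.Request(
        base_url + "/api/tools/search",
        data=body,
        headers={"Content-Type": "application/json"},
        method="POST",
    )

req = build_search_request("http://localhost:8003", "weather forecast for a city")
print(req.full_url, json.loads(req.data))

# With the server running, send it like this:
#   with urllib.request.urlopen(req) as resp:
#       print(json.loads(resp.read()))
```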
## 🧰 Example MCP Configuration
To integrate with an MCP client (like Claude Desktop):
```json
{
  "mcpServers": {
    "one-mcp-server": {
      "command": "python",
      "args": [
        "/absolute/path/to/server.py",
        "--transport", "stdio",
        "--storage_path", "tool_embeddings.json"
      ]
    }
  }
}
```

For dual transport mode (stdio for MCP + HTTP for the REST API):
```json
{
  "mcpServers": {
    "one-mcp-server": {
      "command": "python",
      "args": [
        "/absolute/path/to/server.py",
        "--transport", "stdio,http",
        "--port", "8004",
        "--storage_path", "tool_embeddings.json"
      ]
    }
  }
}
```

## 🏗️ Architecture
### Components
- `server.py`: Entry point that initializes the app and starts the MCP server
- `mcp_server.py`: Handles multi-transport server orchestration (stdio/HTTP/dual)
- `api.py`: FastAPI application factory and REST endpoint definitions
- `mcp_tools.py`: MCP tool decorators and function implementations
- `tools_store.py`: Singleton store for tool embeddings with search capability
- `models.py`: Pydantic models for type safety and validation
- `config.py`: Configuration management and CLI argument parsing
- `logging_setup.py`: Centralized logging with rotating file handlers
### How It Works

1. **Tool Storage**: Tools are stored together with embeddings computed by sentence-transformers.
2. **Semantic Search**: Query embeddings are compared against stored tool embeddings using cosine similarity.
3. **Persistence**: Tools are automatically saved to `tool_embeddings.json` after any modification.
4. **Dual Interface**: The same functionality is available via the REST API and MCP tools.
5. **Multi-Transport**: The server can run stdio (for MCP clients) and HTTP simultaneously.
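The persistence step amounts to round-tripping tool definitions plus their embedding vectors through a JSON file. A minimal sketch of that pattern (illustrative helpers and a toy embedding, not the project's actual `tools_store.py`):

```python
import json
import os
import tempfile

def save_tools(path, tools):
    # Persist tool definitions plus embedding vectors as plain JSON.
    with open(path, "w", encoding="utf-8") as f:
        json.dump(tools, f)

def load_tools(path):
    # First run: no file yet, so start with an empty store.
    if not os.path.exists(path):
        return []
    with open(path, encoding="utf-8") as f:
        return json.load(f)

store = [{"name": "get_weather",
          "description": "Get the current weather for a specific city.",
          "embedding": [0.12, -0.08, 0.33]}]  # toy vector for illustration

path = os.path.join(tempfile.mkdtemp(), "tool_embeddings.json")
save_tools(path, store)
print(load_tools(path)[0]["name"])
```

Saving after every modification keeps the on-disk store consistent even if the process is killed between requests.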
## Dev

Create a zip of the project:

```bash
zip -r one-mcp.zip . -x "*.git/*" -x ".env" -x ".DS_Store" -x ".dockerignore" -x ".gitignore"
```
## 🧑‍💻 Contributing

Contributions are welcome! To contribute:

1. Fork the repository
2. Create a new feature branch (`git checkout -b feature/my-feature`)
3. Commit your changes (`git commit -m "Add my feature"`)
4. Push to your fork (`git push origin feature/my-feature`)
5. Submit a Pull Request

Before submitting, ensure:

- Code passes linting and basic tests.
- You've updated documentation if needed.
## 📜 License

This project is licensed under the MIT License; see the LICENSE file for details.
## 💬 Support

If you encounter any issues or have feature requests:

- Open an issue on GitHub
- Or contact @freakynit directly.