# Simple MCP Data Manager with AI (Python)

A simple Model Context Protocol (MCP) server built with Python, FastAPI, and local AI model integration for managing data stored in a local `data` folder.
## Features

- 🐍 **Python Backend**: FastAPI-based REST API with automatic documentation
- 🔧 **MCP Server**: Implements the Model Context Protocol for AI tool integration
- 🤖 **Local AI Models**: Multiple AI model types running locally on your machine
- 📊 **RESTful API**: Full CRUD operations with Pydantic validation
- 💾 **Data Persistence**: JSON-based data storage in a local `data` folder
- 🎨 **Modern Web Interface**: Beautiful, responsive UI with AI features
- 🔍 **Smart Search**: AI-powered similarity search and traditional search
- 📚 **Auto-generated Docs**: Interactive API documentation with Swagger/ReDoc
- ⚡ **Async Operations**: High-performance async/await patterns
## AI Model Support

The application supports multiple types of local AI models:

### Supported Model Types

- **Sentence Transformers**: For text embeddings and similarity search
  - Default: `all-MiniLM-L6-v2` (fast and efficient)
  - Others: `all-mpnet-base-v2`, `multi-qa-MiniLM-L6-cos-v1`
- **Text Generation**: For text completion and generation
  - Models: `gpt2`, `distilgpt2`, `microsoft/DialoGPT-medium`
- **Text Classification**: For categorizing text
  - Models: `distilbert-base-uncased-finetuned-sst-2-english`
- **Sentiment Analysis**: For analyzing text sentiment
  - Models: `cardiffnlp/twitter-roberta-base-sentiment-latest`
- **TF-IDF**: Traditional text analysis (no external dependencies)
## AI Features

- **Text Analysis**: Analyze individual text pieces
- **Item Analysis**: Analyze data items using AI
- **Similarity Search**: Find similar items using embeddings
- **Smart Search**: Combine traditional and AI search
- **Batch Analysis**: Analyze all items at once
- **Model Switching**: Change AI models on the fly
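As a concrete illustration of the similarity-search idea, here is a minimal TF-IDF sketch in plain Python (the project itself uses scikit-learn's implementation; this standalone version just shows the mechanics):

```python
import math
from collections import Counter


def tfidf_vectors(docs: list[str]) -> list[dict[str, float]]:
    """Compute a TF-IDF weight vector (term -> weight) per document."""
    tokenized = [doc.lower().split() for doc in docs]
    n = len(tokenized)
    # document frequency: number of documents containing each term
    df = Counter(term for tokens in tokenized for term in set(tokens))
    vectors = []
    for tokens in tokenized:
        tf = Counter(tokens)
        vectors.append({
            t: (tf[t] / len(tokens)) * math.log((1 + n) / (1 + df[t]))
            for t in tf
        })
    return vectors


def cosine(a: dict[str, float], b: dict[str, float]) -> float:
    dot = sum(w * b.get(t, 0.0) for t, w in a.items())
    na = math.sqrt(sum(w * w for w in a.values()))
    nb = math.sqrt(sum(w * w for w in b.values()))
    return dot / (na * nb) if na and nb else 0.0


def find_similar(query: str, docs: list[str]) -> list[tuple[float, str]]:
    """Rank documents by cosine similarity to the query."""
    vectors = tfidf_vectors(docs + [query])  # vectorize query with the docs
    query_vec = vectors[-1]
    scored = [(cosine(query_vec, vec), doc)
              for vec, doc in zip(vectors, docs)]
    return sorted(scored, reverse=True)
```

The embedding-based similarity search works the same way, but replaces the sparse TF-IDF vectors with dense sentence embeddings.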
## Project Structure
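The layout diagram was not preserved in this copy. Based on the files listed under Project Structure Details below, the project is organized roughly as follows (the `data/` and `models/` directories are created at runtime):

```
app/
├── ai/
│   └── local_model.py
├── api/
│   ├── ai_routes.py
│   └── routes.py
├── models/
│   └── data_model.py
├── schemas/
│   └── item.py
├── main.py
└── mcp_server.py
static/
└── index.html
data/            # JSON data storage (created at runtime)
models/          # cached AI models (created at runtime)
run.py
requirements.txt
```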
## Installation

### Prerequisites

- Python 3.8 or higher
- pip (Python package installer)
- Sufficient RAM for AI models (2-4GB recommended)

### Setup

1. Clone or download the project
2. Install dependencies:

   ```bash
   pip install -r requirements.txt
   ```

3. Run the FastAPI server:

   ```bash
   python run.py
   ```

   or

   ```bash
   python -m uvicorn app.main:app --host 0.0.0.0 --port 8000 --reload
   ```
## Usage

### Web Interface

Visit http://localhost:8000 to access the web interface with two main tabs:

#### 📊 Data Management Tab

- **Create Items**: Add new items with name, description, and category
- **View Items**: See all items in a beautiful card layout
- **Search Items**: Traditional text search across all item fields
- **Edit/Delete Items**: Update and remove items

#### 🤖 AI Features Tab

- **AI Model Control**: Change AI model type and name
- **Text Analysis**: Analyze individual text pieces
- **AI-Powered Search**: Find similar items using embeddings
- **Smart Search**: Combine traditional and AI search results
- **Batch Analysis**: Analyze all items using AI
## REST API Endpoints

Base URL: `http://localhost:8000/api`

### Data Management Endpoints

| Method | Endpoint | Description |
|--------|----------|-------------|
| GET    |          | Get all items |
| GET    |          | Get item by ID |
| POST   |          | Create new item |
| PUT    |          | Update item |
| DELETE |          | Delete item |
| GET    |          | Search items |
| GET    |          | Health check |
### AI Model Endpoints

| Method | Endpoint | Description |
|--------|----------|-------------|
| GET    |          | Get AI model information |
| POST   |          | Change AI model |
| POST   |          | Analyze text |
| GET    |          | Analyze all items |
| GET    |          | Find similar items |
| GET    |          | Analyze specific item |
| GET    |          | Smart search |
## API Documentation

- **Swagger UI**: http://localhost:8000/docs
- **ReDoc**: http://localhost:8000/redoc
## Example API Usage

**Create an item:**

**Analyze text with AI:**

**Find similar items:**

**Smart search:**
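The original request examples were not preserved in this copy. The cURL sketches below illustrate the four operations above; the endpoint paths (`/api/items`, `/api/ai/...`) and request bodies are assumptions for illustration, so check the Swagger UI at `/docs` for the actual routes:

```bash
# Create an item (assumed path)
curl -X POST http://localhost:8000/api/items \
  -H "Content-Type: application/json" \
  -d '{"name": "Laptop", "description": "A portable computer", "category": "electronics"}'

# Analyze text with AI (assumed path)
curl -X POST http://localhost:8000/api/ai/analyze \
  -H "Content-Type: application/json" \
  -d '{"text": "This product is fantastic!"}'

# Find similar items (assumed path)
curl "http://localhost:8000/api/ai/similar?query=portable%20computer"

# Smart search (assumed path)
curl "http://localhost:8000/api/ai/smart-search?query=laptop"
```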
## MCP Server

The MCP server provides tools that can be used by AI assistants:

### Available Tools

**Data Management Tools:**

1. `get_all_items` - Retrieve all items from the data store
2. `get_item_by_id` - Get a specific item by its ID
3. `create_item` - Create a new item with name, description, and category
4. `update_item` - Update an existing item by ID
5. `delete_item` - Delete an item by ID
6. `search_items` - Search items by query string

**AI Model Tools:**

7. `get_ai_model_info` - Get information about the loaded AI model
8. `change_ai_model` - Change the AI model type and name
9. `analyze_text` - Analyze text using the AI model
10. `analyze_all_items` - Analyze all items using the AI model
11. `find_similar_items` - Find items similar to a query using AI embeddings
12. `analyze_single_item` - Analyze a specific item using the AI model
13. `smart_search` - Smart search combining traditional search with AI similarity
### Running the MCP Server
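The original command for this section was not preserved. Since the implementation lives in `app/mcp_server.py` (see Project Structure Details), it can presumably be launched as a module, for example:

```bash
python -m app.mcp_server
```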
## AI Model Configuration

### Model Types and Examples

**Sentence Transformers** (recommended for similarity search):

```python
model_type = "sentence_transformer"
model_name = "all-MiniLM-L6-v2"  # Fast and efficient
```

**Text Generation:**

```python
model_type = "text_generation"
model_name = "gpt2"  # or "distilgpt2"
```

**Sentiment Analysis:**

```python
model_type = "sentiment_analysis"
model_name = "cardiffnlp/twitter-roberta-base-sentiment-latest"
```

**Text Classification:**

```python
model_type = "text_classification"
model_name = "distilbert-base-uncased-finetuned-sst-2-english"
```

**TF-IDF** (no external dependencies):

```python
model_type = "tfidf"
model_name = "TF-IDF"
```
### Model Caching

Models are automatically cached in the `models/` directory to avoid re-downloading. The cache directory is created automatically.
## Data Structure

Items are stored with the following structure:
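The original example was not preserved in this copy. Based on the fields used throughout this README (name, description, and category), the stored structure presumably resembles the following Pydantic sketch; the `id` and `created_at` fields are assumptions:

```python
from datetime import datetime, timezone
from uuid import uuid4

from pydantic import BaseModel, Field


class Item(BaseModel):
    """Sketch of a stored item; id and created_at are assumed fields."""
    id: str = Field(default_factory=lambda: str(uuid4()))
    name: str
    description: str
    category: str
    created_at: datetime = Field(
        default_factory=lambda: datetime.now(timezone.utc))


item = Item(name="Laptop", description="A portable computer",
            category="electronics")
print(item.name, item.category)
```

The actual schema lives in `app/schemas/item.py`, so verify field names there.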
## API Response Format

All API responses follow a consistent format:

### Success Response

### Error Response
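The original JSON samples were not preserved in this copy. A conventional envelope that such a format might follow is sketched below; every field name here is an assumption, so verify against the actual responses at `/docs`:

```python
import json

# Hypothetical success envelope; field names are assumptions
success_response = {
    "success": True,
    "data": {"id": "123", "name": "Laptop", "category": "electronics"},
}

# Hypothetical error envelope
error_response = {
    "success": False,
    "error": "Item not found",
}

print(json.dumps(success_response))
print(json.dumps(error_response))
```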
## Development

### Project Structure Details

- `app/models/data_model.py`: Handles all data operations (CRUD)
- `app/schemas/item.py`: Pydantic models for request/response validation
- `app/api/routes.py`: FastAPI route definitions for data management
- `app/api/ai_routes.py`: FastAPI route definitions for AI operations
- `app/ai/local_model.py`: AI model manager with multiple model types
- `app/main.py`: Main FastAPI application with middleware
- `app/mcp_server.py`: MCP server implementation with AI tools
- `static/index.html`: Web interface with AI features

### Adding New Features

- **New API Endpoints**: Add routes in `app/api/routes.py` or `app/api/ai_routes.py`
- **Data Model Changes**: Modify `app/models/data_model.py`
- **Schema Updates**: Update `app/schemas/item.py`
- **AI Model Types**: Add new model types in `app/ai/local_model.py`
- **MCP Tools**: Add new tools in `app/mcp_server.py`
- **UI Updates**: Modify `static/index.html`
## Testing

You can test the API using:

- **Web Interface**: http://localhost:8000
- **Swagger UI**: http://localhost:8000/docs
- **cURL**: Command line examples above
- **Postman**: Import the OpenAPI spec from `/docs`
## Environment Variables

You can customize the server behavior with environment variables:
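The original variable list was not preserved in this copy. Variables along these lines are plausible given the defaults used elsewhere in this README, but the names are assumptions; check `run.py` for the ones actually read:

```shell
# Hypothetical variable names; check run.py for the actual ones
export HOST=0.0.0.0
export PORT=8000
```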
## Dependencies

### Core Dependencies

- **fastapi**: Modern web framework for building APIs
- **uvicorn**: ASGI server for running FastAPI
- **pydantic**: Data validation using Python type annotations
- **mcp**: Model Context Protocol implementation
- **aiofiles**: Async file operations
- **python-multipart**: Form data parsing

### AI/ML Dependencies

- **transformers**: Hugging Face Transformers library
- **torch**: PyTorch for deep learning
- **sentence-transformers**: Sentence embeddings
- **numpy**: Numerical computing
- **scikit-learn**: Machine learning utilities

### Development Dependencies

- **python-json-logger**: Structured logging
## Performance Features

- **Async/Await**: All database and AI operations are asynchronous
- **Pydantic Validation**: Automatic request/response validation
- **CORS Support**: Cross-origin resource sharing enabled
- **Static File Serving**: Efficient static file delivery
- **JSON Storage**: Simple, fast file-based storage
- **Model Caching**: AI models are cached locally
- **Memory Efficient**: Models are loaded on demand
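The JSON-storage pattern noted above can be sketched with the standard library alone. The real `app/models/data_model.py` presumably uses async file I/O via aiofiles, and the file path below is an assumption:

```python
import json
from pathlib import Path
from uuid import uuid4

DATA_FILE = Path("data/items.json")  # path is an assumption


def load_items() -> list[dict]:
    """Read all items from the JSON data file."""
    if DATA_FILE.exists():
        return json.loads(DATA_FILE.read_text())
    return []


def save_items(items: list[dict]) -> None:
    """Write all items back to the JSON data file."""
    DATA_FILE.parent.mkdir(parents=True, exist_ok=True)
    DATA_FILE.write_text(json.dumps(items, indent=2))


def create_item(name: str, description: str, category: str) -> dict:
    """Append a new item and persist the whole collection."""
    items = load_items()
    item = {"id": str(uuid4()), "name": name,
            "description": description, "category": category}
    items.append(item)
    save_items(items)
    return item
```

Rewriting the whole file on every change keeps the storage simple and is fine for small datasets, which is the trade-off this project makes.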
## Security Features

- **Input Validation**: Pydantic schemas validate all inputs
- **CORS Configuration**: Configurable cross-origin policies
- **Error Handling**: Proper error responses without data leakage
- **File Path Safety**: Secure file operations with path validation
- **Local AI**: All AI processing happens locally on your machine
## Deployment

### Local Development

### Production

### Docker (Optional)
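The original commands for these subsections were not preserved. Plausible invocations, based on the run commands shown under Installation (the `--workers` count is illustrative), are:

```bash
# Local development: auto-reload on code changes
python -m uvicorn app.main:app --host 0.0.0.0 --port 8000 --reload

# Production: multiple worker processes, no reload
python -m uvicorn app.main:app --host 0.0.0.0 --port 8000 --workers 4
```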
## Troubleshooting

### Common Issues

- **Port already in use**: Change the port with `--port 8001`
- **Import errors**: Ensure you're in the correct directory
- **Permission errors**: Check file permissions for the data directory
- **MCP connection issues**: Verify the MCP server is running correctly
- **AI model loading errors**: Check your internet connection for the model download
- **Memory issues**: Use smaller models or increase system RAM

### AI Model Issues

- **Model not loading**: Check the internet connection and model name
- **Memory errors**: Use smaller models like `all-MiniLM-L6-v2`
- **Slow performance**: Models are cached after the first load
- **CUDA errors**: Models run on the CPU by default
### Logs

The application provides detailed logging. Check the console output for error messages and debugging information.
## Contributing

1. Fork the repository
2. Create a feature branch
3. Make your changes
4. Add tests if applicable
5. Submit a pull request
## License

MIT License - feel free to use this project for your own purposes.
## Support

If you encounter any issues or have questions:

1. Check the API documentation at `/docs`
2. Review the logs for error messages
3. Verify the AI dependencies are installed
4. Open an issue on the repository
## Roadmap

- Database integration (PostgreSQL, SQLite)
- Authentication and authorization
- File upload support
- Real-time updates with WebSockets
- Docker containerization
- Unit and integration tests
- CI/CD pipeline
- More AI model types (image analysis, audio processing)
- Model fine-tuning capabilities
- Batch processing for large datasets
## Related MCP Servers

- A Model Context Protocol server that enables AI assistants like Claude to perform Python development tasks through file operations, code analysis, project management, and safe code execution.
- A Model Context Protocol server that enables AI models to interact with MySQL databases, providing tools for querying, executing statements, listing tables, and describing table structures. (MIT License)
- A streamlined foundation for building Model Context Protocol servers in Python, designed to make AI-assisted development of MCP tools easier and more efficient. (MIT License)