# Quick Start Guide
Get your MCP Database System up and running in minutes!
## Prerequisites
- Python 3.8+
- PostgreSQL database
- OpenAI API key (or other LLM provider)
## Installation
1. **Clone or download** the MCP system
2. **Install dependencies**:
```bash
pip install -r requirements.txt
```
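To confirm the core dependencies installed cleanly, you can run a quick import check. The package names below (`sqlalchemy`, `openai`, `fastapi`, `uvicorn`) are assumptions based on the components this guide uses, so adjust them to match `requirements.txt`:
```python
# verify_install.py -- hypothetical sanity check; adjust package names to requirements.txt
import importlib
import sys

assert sys.version_info >= (3, 8), "Python 3.8+ is required"

for package in ("sqlalchemy", "openai", "fastapi", "uvicorn"):
    try:
        importlib.import_module(package)
        print(f"{package}: OK")
    except ImportError as exc:
        print(f"{package}: MISSING ({exc})")
```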
## Configuration
### Option 1: Environment Variables (Recommended)
Copy the example environment file:
```bash
cp config/.env.example .env
```
Edit `.env` with your settings:
```bash
# Database
DATABASE_URL=postgresql://username:password@localhost:5432/your_database
# LLM
OPENAI_API_KEY=your_openai_api_key_here
LLM_MODEL=gpt-4
# Server
SERVER_PORT=8000
```
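To confirm the values are actually picked up, here is a minimal sketch using `python-dotenv`; the project's own config code handles loading in practice, so treat this as illustrative only:
```python
# check_env.py -- illustrative only; the project loads configuration itself
import os

from dotenv import load_dotenv  # pip install python-dotenv

load_dotenv()  # reads .env from the current directory

for key in ("DATABASE_URL", "OPENAI_API_KEY", "LLM_MODEL", "SERVER_PORT"):
    value = os.getenv(key)
    print(f"{key} = {'<missing>' if value is None else value}")
```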
### Option 2: Configuration File
Copy the example config:
```bash
cp config/example.yaml config.yaml
```
Edit `config.yaml` with your settings.
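The exact keys are defined by `config/example.yaml`. A quick way to confirm your edited file still parses (the printed key names are placeholders, not a guaranteed layout):
```python
# validate_config.py -- checks that config.yaml is valid YAML
import yaml  # pip install pyyaml

with open("config.yaml") as f:
    config = yaml.safe_load(f)

print(list(config.keys()))  # e.g. database, llm, server -- see example.yaml for the real layout
```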
## Quick Start
### 1. Start the HTTP Server
```bash
python http_bridge.py
```
The server starts at `http://localhost:8000` (or the port set in `SERVER_PORT`).
### 2. Test the API
**Health Check:**
```bash
curl http://localhost:8000/health
```
**Smart Search:**
```bash
curl -X POST http://localhost:8000/api/database/smart-search \
-H "Content-Type: application/json" \
-d '{"question": "How many tables are in the database?"}'
```
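The same two calls from Python, using `requests` (any HTTP client works; the endpoints and payloads are exactly those shown above):
```python
import requests

BASE_URL = "http://localhost:8000"

# Health check
print(requests.get(f"{BASE_URL}/health").json())

# Smart search
resp = requests.post(
    f"{BASE_URL}/api/database/smart-search",
    json={"question": "How many tables are in the database?"},
)
resp.raise_for_status()
print(resp.json()["response"])  # Markdown-formatted answer
```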
### 3. Use from React
```javascript
const response = await fetch('http://localhost:8000/api/database/smart-search', {
  method: 'POST',
  headers: { 'Content-Type': 'application/json' },
  body: JSON.stringify({ question: 'Show me user statistics' })
});
const result = await response.json();
console.log(result.response); // Markdown-formatted answer
```
## Basic Usage Examples
### Schema Information
```bash
curl -X POST http://localhost:8000/api/database/schema \
-H "Content-Type: application/json" \
-d '{"include_sample_data": false}'
```
### SQL Execution
```bash
curl -X POST http://localhost:8000/api/database/sql/execute \
-H "Content-Type: application/json" \
-d '{"sql": "SELECT COUNT(*) FROM users", "limit": true}'
```
### Semantic Search
```bash
curl -X POST http://localhost:8000/api/database/semantic/search \
-H "Content-Type: application/json" \
-d '{"query": "user authentication", "limit": 5}'
```
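If you call these endpoints often, a small Python helper keeps the payloads in one place. This is a sketch only; the request bodies mirror the curl examples above:
```python
import requests

BASE_URL = "http://localhost:8000/api/database"

def post(path: str, payload: dict) -> dict:
    """POST a JSON payload to the HTTP bridge and return the parsed response."""
    resp = requests.post(f"{BASE_URL}/{path}", json=payload)
    resp.raise_for_status()
    return resp.json()

schema = post("schema", {"include_sample_data": False})
counts = post("sql/execute", {"sql": "SELECT COUNT(*) FROM users", "limit": True})
matches = post("semantic/search", {"query": "user authentication", "limit": 5})
```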
## MCP Protocol Usage
### Start MCP Server
```bash
python -m presentation.mcp_server
```
### Connect MCP Client
```python
from presentation.mcp_server import MCPDatabaseServer

config = {
    'database': {'connection_string': 'postgresql://...'},
    'llm': {'provider': 'openai', 'api_key': 'sk-...'}
}

server = MCPDatabaseServer(config)
await server.initialize()

# Use MCP tools
result = await server.handle_tool_call('smart_search', {
    'question': 'What data do we have?'
})
```
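The `await` calls above need a running event loop. In a script, wrap them in an async function along these lines (a minimal sketch, reusing the same placeholder config):
```python
import asyncio

from presentation.mcp_server import MCPDatabaseServer

async def main():
    config = {
        'database': {'connection_string': 'postgresql://...'},
        'llm': {'provider': 'openai', 'api_key': 'sk-...'}
    }
    server = MCPDatabaseServer(config)
    await server.initialize()
    result = await server.handle_tool_call('smart_search', {'question': 'What data do we have?'})
    print(result)

asyncio.run(main())
```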
## Python Integration
### Direct Service Usage
```python
from sqlalchemy import create_engine

from services.smart_search_service import SmartSearchService
from repositories.postgres_repository import PostgresRepository
from services.schema_service import SchemaService
# ... other imports

# Setup
engine = create_engine('postgresql://...')
postgres_repo = PostgresRepository(engine)
schema_service = SchemaService(postgres_repo)
# ... set up the other services (sql_service, semantic_service, synthesis_service)

smart_search = SmartSearchService(
    schema_service, sql_service, semantic_service, synthesis_service
)

# Use (inside an async function)
result = await smart_search.search("How many users are active?")
print(result['response'])
```
### Legacy Compatibility
```python
# Existing code keeps working through the compatibility wrapper
from llmDatabaseRouter import LLMDatabaseRouter
router = LLMDatabaseRouter(engine)
result = await router.answer_question("Show me the data schema")
```
## Common Use Cases
### 1. Data Discovery
**Question:** "What data is available?"
**Response:** Comprehensive schema overview with table descriptions
### 2. Quick Analytics
**Question:** "How many customers signed up this month?"
**Response:** SQL query execution with formatted results
### 3. Concept Explanation
**Question:** "What is a foreign key?"
**Response:** Semantic search providing educational content
### 4. Hybrid Queries
**Question:** "Show me user data and explain what each field means"
**Response:** SQL results + semantic explanations
## Troubleshooting
### Database Connection Issues
```bash
# Test connection
python -c "
from sqlalchemy import create_engine
engine = create_engine('your_connection_string')
with engine.connect() as conn:
    print('Connected successfully!')
"
```
### API Key Issues
```bash
# Test the OpenAI API key (client shown requires openai>=1.0; older SDKs used openai.Model.list())
python -c "
from openai import OpenAI
client = OpenAI(api_key='your_key')
print(client.models.list())
"
```
### Server Not Starting
1. Check port availability: `netstat -an | grep 8000` (or use the Python sketch after this list)
2. Check logs for error messages
3. Verify that all dependencies are installed
4. Check configuration file syntax
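For item 1, a quick Python alternative to `netstat` is to try binding the port; if the bind fails, something is already listening on 8000:
```python
# port_check.py -- reports whether port 8000 is free to bind
import socket

with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as sock:
    try:
        sock.bind(("127.0.0.1", 8000))
        print("Port 8000 is free")
    except OSError as exc:
        print(f"Port 8000 is in use: {exc}")
```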
### No Results from Search
1. Verify database has data
2. Check table permissions
3. Ensure the vector extension is installed for semantic search (see the check after this list)
4. Check LLM API quotas and limits
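For item 3, you can look for the extension directly in the PostgreSQL catalog (this assumes semantic search relies on `pgvector`, the usual choice for vector columns):
```python
# check_pgvector.py -- looks for the "vector" extension in the connected database
from sqlalchemy import create_engine, text

engine = create_engine("your_connection_string")
with engine.connect() as conn:
    row = conn.execute(
        text("SELECT extversion FROM pg_extension WHERE extname = 'vector'")
    ).fetchone()

print(f"pgvector {row[0]} is installed" if row else "pgvector is NOT installed")
```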
## Next Steps
1. **Read the [Architecture Guide](ARCHITECTURE.md)** for detailed understanding
2. **Review [API Documentation](API.md)** for complete endpoint reference
3. **Check [Configuration Guide](CONFIGURATION.md)** for advanced settings
4. **See [Migration Guide](MIGRATION.md)** if upgrading from legacy system
## Production Deployment
For production deployment:
1. **Set environment to production:**
```bash
export MCP_ENVIRONMENT=production
```
2. **Use production database:**
```bash
export DATABASE_URL=postgresql://prod_user:prod_pass@prod_host:5432/prod_db
```
3. **Configure security:**
```bash
export SECRET_KEY=your_long_random_secret_key
```
4. **Start with multiple workers:**
```bash
uvicorn http_bridge:app --host 0.0.0.0 --port 8000 --workers 4
```
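Equivalently, the workers can be configured programmatically, assuming `http_bridge` exposes an ASGI `app` as the command above implies:
```python
# serve.py -- programmatic equivalent of the uvicorn command above
import uvicorn

if __name__ == "__main__":
    uvicorn.run("http_bridge:app", host="0.0.0.0", port=8000, workers=4)
```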
## Support
- **Documentation:** Check the `docs/` directory
- **Examples:** See `examples/` directory
- **Issues:** Check the existing test files for working usage patterns
- **Configuration:** Review `config/` directory for examples
Happy searching! 🚀