# Test Suite Overview - MCP Wikipedia Server
Comprehensive testing framework for the MCP Wikipedia Server project.
## Test Suite Summary
| Test Type | File | Purpose | Execution Time |
|-----------|------|---------|----------------|
| **Unit Tests** | `test_server.py` | Individual function and component testing | ~30s |
| **Integration Tests** | `test_integration.py` | End-to-end workflow testing | ~60s |
| **Performance Tests** | `test_performance.py` | Response time and load testing | ~90s |
| **MCP Compliance** | `test_mcp_compliance.py` | Protocol specification validation | ~45s |
| **Quick Validation** | `quick_test.py` | Fast sanity check | ~10s |
## Quick Start
### Run All Tests
```bash
# Comprehensive test suite
python tests/run_tests.py
# Quick validation only
python tests/quick_test.py
# Specific test types
python tests/run_tests.py --unit # Unit tests only
python tests/run_tests.py --integration # Integration tests only
python tests/run_tests.py --performance # Performance tests only
python tests/run_tests.py --mcp # MCP compliance only
python tests/run_tests.py --quick # Quick tests only
```
### Using pytest (Advanced)
```bash
# Install test dependencies
pip install -r requirements-test.txt
# Run with pytest
python -m pytest tests/test_server.py -v
# Run with coverage
python -m pytest tests/test_server.py --cov=src --cov-report=html
```
## Test Coverage
### Unit Tests (`test_server.py`)
- ✅ **WikipediaServer Class**: Initialization and basic functionality
- ✅ **fetch_wikipedia_info**: Search functionality, error handling, edge cases
- ✅ **list_wikipedia_sections**: Section extraction, validation
- ✅ **get_section_content**: Content retrieval, error scenarios
- ✅ **Performance**: Response time validation
- ✅ **Error Handling**: Network errors, invalid inputs, malformed data
- ✅ **Utilities**: Helper functions and metadata consistency
**Coverage Target**: 95%+ code coverage
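As a sketch of the unit-test pattern, the `wikipedia` library can be replaced with a mock so tests need no network access. The `fetch_wikipedia_info` below is a simplified, hypothetical stand-in for the real helper in `src/`; the real tests in `test_server.py` exercise the actual implementation.

```python
from unittest.mock import MagicMock

# Hypothetical stand-in for the server's fetch helper (the real one lives
# in src/); shown only to illustrate the mocking pattern.
def fetch_wikipedia_info(query, wikipedia_module):
    page = wikipedia_module.page(query)
    return {"title": page.title, "summary": page.summary[:200]}

def test_fetch_info_success():
    # Mock the wikipedia module so the test is fast and offline.
    fake_wikipedia = MagicMock()
    fake_wikipedia.page.return_value = MagicMock(
        title="Python (programming language)",
        summary="Python is a high-level programming language. " * 10,
    )
    result = fetch_wikipedia_info("Python", fake_wikipedia)
    assert result["title"] == "Python (programming language)"
    assert len(result["summary"]) <= 200  # summary is truncated

test_fetch_info_success()
```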
### Integration Tests (`test_integration.py`)
- ✅ **Complete Workflows**: Multi-step research processes
- ✅ **Error Recovery**: Graceful failure handling
- ✅ **Data Consistency**: Repeated request validation
- ✅ **Load Testing**: Concurrent request handling
- ✅ **Real-world Scenarios**: Typical usage patterns
**Success Criteria**: 80%+ workflow success rate
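A minimal sketch of the workflow shape these tests exercise, using hypothetical async stand-ins for the server's tools (the real tests in `test_integration.py` drive the actual server end to end):

```python
import asyncio

# Hypothetical stand-ins for the server's tools, for illustration only.
async def fake_search(query):
    await asyncio.sleep(0)  # simulate async I/O
    return {"title": query.title()}

async def fake_list_sections(title):
    await asyncio.sleep(0)
    return ["History", "Syntax and semantics"]

async def research_workflow(query):
    """Multi-step workflow: search, then list sections, with error recovery."""
    try:
        info = await fake_search(query)
        sections = await fake_list_sections(info["title"])
        return {"title": info["title"], "sections": sections}
    except Exception as exc:  # graceful failure instead of a crash
        return {"error": str(exc)}

result = asyncio.run(research_workflow("python"))
assert "sections" in result  # the workflow completed end to end
```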
### Performance Tests (`test_performance.py`)
- ✅ **Response Times**: Individual tool benchmarks
- ✅ **Throughput**: Requests per minute validation
- ✅ **Concurrency**: Parallel request handling
- ✅ **Load Testing**: Stress testing under simulated load
- ✅ **Memory Usage**: Resource consumption monitoring
**Performance Targets**:
- Average response time: <2s
- Concurrent requests: 10+ simultaneous
- Success rate: >95%
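A hypothetical micro-benchmark mirroring this pattern: issue concurrent calls to a stand-in tool and check them against the targets above. The real benchmarks in `test_performance.py` call the live server instead.

```python
import asyncio
import time

async def fake_tool_call():
    await asyncio.sleep(0.01)  # stand-in for a real Wikipedia tool call
    return "ok"

async def run_benchmark(n=10):
    # Fire n calls concurrently and time the whole batch.
    start = time.perf_counter()
    results = await asyncio.gather(*(fake_tool_call() for _ in range(n)))
    return results, time.perf_counter() - start

results, elapsed = asyncio.run(run_benchmark())
success_rate = results.count("ok") / len(results)
assert success_rate > 0.95  # success-rate target
assert elapsed < 2.0        # well under the 2s response-time target
```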
### MCP Compliance Tests (`test_mcp_compliance.py`)
- ✅ **Protocol Handshake**: Initialization sequence
- ✅ **Tool Discovery**: tools/list implementation
- ✅ **Tool Execution**: tools/call functionality
- ✅ **Error Handling**: Invalid request rejection
- ✅ **Message Format**: JSON-RPC 2.0 compliance
- ✅ **Communication**: stdin/stdout transport
**Compliance Target**: 100% MCP specification adherence
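For orientation, here are illustrative JSON-RPC 2.0 messages for the handshake and tool discovery. The field names follow the MCP specification, while the client info and protocol version string are example values:

```python
import json

# First message of the handshake: the client initializes the session.
initialize_request = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "initialize",
    "params": {
        "protocolVersion": "2024-11-05",  # example version string
        "capabilities": {},
        "clientInfo": {"name": "test-client", "version": "0.1.0"},
    },
}

# Tool discovery: ask the server what tools it exposes.
tools_list_request = {"jsonrpc": "2.0", "id": 2, "method": "tools/list"}

# Every message must declare jsonrpc "2.0" and travels as one JSON line
# over the server's stdin/stdout transport.
for message in (initialize_request, tools_list_request):
    assert message["jsonrpc"] == "2.0"
    wire_format = json.dumps(message)
```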
## Test Architecture
### Test Organization
```
tests/
├── test_server.py          # Unit tests (pytest-based)
├── test_integration.py     # Integration tests (async)
├── test_performance.py     # Performance benchmarks (async)
├── test_mcp_compliance.py  # MCP protocol tests (async)
├── quick_test.py           # Fast validation script
├── run_tests.py            # Unified test runner
└── README.md               # This documentation
```
### Test Dependencies
```bash
# Core testing
pytest>=7.0.0
pytest-asyncio>=0.21.0
pytest-cov>=4.0.0
# Performance testing
memory-profiler>=0.60.0
psutil>=5.9.0
# See requirements-test.txt for complete list
```
### Configuration Files
- **pytest.ini**: pytest configuration and markers
- **requirements-test.txt**: Testing dependencies
## Test Execution Matrix
| Environment | Unit | Integration | Performance | MCP | Status |
|-------------|------|-------------|-------------|-----|--------|
| **Python 3.11** | ✅ | ✅ | ✅ | ✅ | Primary |
| **Python 3.12** | ✅ | ✅ | ✅ | ✅ | Supported |
| **macOS** | ✅ | ✅ | ✅ | ✅ | Tested |
| **Linux** | ✅ | ✅ | ✅ | ✅ | Expected |
| **Windows** | ⚠️ | ⚠️ | ⚠️ | ⚠️ | Untested |
## Test Configuration
### Environment Variables
```bash
# Test behavior
export TESTING=true
export TEST_TIMEOUT=30
export LOG_LEVEL=DEBUG
# Wikipedia API
export WIKIPEDIA_TIMEOUT=10
export WIKIPEDIA_RETRIES=3
```
### pytest Markers
```bash
# Run specific test categories
pytest -m unit # Unit tests only
pytest -m integration # Integration tests only
pytest -m performance # Performance tests only
pytest -m "not slow" # Skip slow tests
```
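These markers must be registered in `pytest.ini`; a minimal sketch of what that file might contain (the marker descriptions here are assumptions, as is the async mode setting checked in the debugging section below):

```ini
[pytest]
asyncio_mode = auto
markers =
    unit: unit tests
    integration: integration tests
    performance: performance benchmarks
    slow: long-running tests
```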
## Continuous Integration
### GitHub Actions (Recommended)
```yaml
name: Test Suite
on: [push, pull_request]
jobs:
  test:
    runs-on: ubuntu-latest
    strategy:
      matrix:
        python-version: ["3.11", "3.12"]
    steps:
      - uses: actions/checkout@v3
      - name: Set up Python
        uses: actions/setup-python@v3
        with:
          python-version: ${{ matrix.python-version }}
      - name: Install dependencies
        run: |
          pip install -r requirements-test.txt
          pip install wikipedia mcp fastmcp
      - name: Run tests
        run: python tests/run_tests.py
```
### Local CI Simulation
```bash
# Run tests like CI would
python tests/run_tests.py --all
# Quick pre-commit check
python tests/quick_test.py
```
## Quality Gates
### Pre-commit Requirements
- ✅ All unit tests pass
- ✅ Quick test validation passes
- ✅ No critical performance regressions
### Pre-release Requirements
- ✅ All test suites pass (unit, integration, performance, MCP)
- ✅ Code coverage >90%
- ✅ Performance benchmarks within targets
- ✅ MCP compliance 100%
### Production Deployment Requirements
- ✅ Full test suite passes on target environment
- ✅ Load testing validates capacity
- ✅ Integration tests confirm external API functionality
## Debugging Test Failures
### Common Issues and Solutions
**Import Errors**
```bash
# Fix Python path issues
export PYTHONPATH="${PYTHONPATH}:$(pwd)/src"
python -m pytest tests/
```
**Async Test Issues**
```bash
# Install pytest-asyncio
pip install pytest-asyncio
# Verify asyncio mode in pytest.ini
grep "asyncio_mode = auto" pytest.ini
```
**Wikipedia API Timeouts**
```bash
# Increase timeout for slow connections
export WIKIPEDIA_TIMEOUT=30
python tests/run_tests.py --unit
```
**MCP Compliance Failures**
```bash
# Test server startup manually
python src/mcp_server/mcp_server.py --test
# Check MCP dependencies
python -c "import mcp, fastmcp; print('MCP modules OK')"
```
### Debug Mode
```bash
# Verbose output
python tests/run_tests.py --unit -v
# Debug specific test
python -m pytest tests/test_server.py::TestFetchWikipediaInfo::test_fetch_info_success -v -s
```
## Test Metrics and Reporting
### Coverage Reports
```bash
# Generate HTML coverage report
python -m pytest tests/test_server.py --cov=src --cov-report=html
open htmlcov/index.html
# Terminal coverage report
python -m pytest tests/test_server.py --cov=src --cov-report=term-missing
```
### Performance Reports
```bash
# Run performance benchmarks
python tests/test_performance.py
# Monitor resource usage
python -c "
import psutil
import time
print(f'Memory: {psutil.virtual_memory().percent}%')
print(f'CPU: {psutil.cpu_percent(interval=1)}%')
"
```
## Contributing Tests
### Adding New Tests
1. **Unit Tests**: Add to `test_server.py` following existing patterns
2. **Integration Tests**: Add workflows to `test_integration.py`
3. **Performance Tests**: Add benchmarks to `test_performance.py`
4. **MCP Tests**: Add protocol tests to `test_mcp_compliance.py`
### Test Writing Guidelines
- Use descriptive test names: `test_fetch_info_handles_empty_query`
- Include both positive and negative test cases
- Test edge cases and error conditions
- Use appropriate pytest markers
- Add docstrings explaining test purpose
- Mock external dependencies when appropriate
### Test Review Checklist
- [ ] Tests cover new functionality
- [ ] Both success and failure cases tested
- [ ] Performance impact considered
- [ ] Documentation updated
- [ ] CI pipeline passes
---
## Support
For test-related issues:
1. **Check this documentation** for common solutions
2. **Run quick_test.py** to validate basic functionality
3. **Check test logs** for detailed error information
4. **Review test configuration** (pytest.ini, requirements-test.txt)
5. **Create GitHub issue** with test failure details
## Success Criteria
- ✅ **Production Ready**: All test suites pass consistently
- ✅ **Performance Validated**: Response times within targets
- ✅ **Protocol Compliant**: 100% MCP specification adherence
- ✅ **Error Resilient**: Graceful handling of all failure modes
- ✅ **Well Documented**: Complete test coverage and documentation
Your MCP Wikipedia Server now has **enterprise-grade testing** that validates reliability, performance, and compliance!