Enterprise MCP Template
A production-ready template for building enterprise-grade MCP (Model Context Protocol) servers with OAuth 2.0 authentication, based on battle-tested patterns from the Luxsant NetSuite MCP project.
What is MCP? MCP is a standard protocol that lets AI assistants (Claude, Copilot, etc.) call "tools" (functions) on remote servers. Think of it as a standardized API that AI models know how to use.
Quick Start
1. Clone and rename
git clone https://github.com/YOUR_USER/enterprise-mcp-template.git my-cool-mcp
cd my-cool-mcp

2. Rename the package
# Rename the source directory
mv src/my_mcp_server src/my_cool_mcp
# Find and replace all occurrences:
# "my_mcp_server" -> "my_cool_mcp"
# "my-mcp-server" -> "my-cool-mcp"
# "{{PROJECT_NAME}}" -> "My Cool MCP"
# "{{AUTHOR}}" -> "Your Name"

3. Configure environment
cp .env.example .env
# Edit .env with your upstream API credentials

4. Install and run
# Create virtual environment
python -m venv venv
source venv/bin/activate # Linux/Mac
# or: venv\Scripts\activate # Windows
# Install dependencies
pip install -e ".[dev]"
# Run locally (stdio mode for Claude Desktop)
python -m my_cool_mcp
# Run as HTTP server
python -m my_cool_mcp http
# Run tests
pytest

5. Deploy
# Docker build
docker compose up --build
# Or deploy to Azure Web App
az webapp up --name my-cool-mcp --runtime PYTHON:3.11

Architecture Overview
AI Client (Claude Desktop / VS Code / Custom)
|
| MCP Protocol (stdio / SSE / HTTP)
|
+---v----------------------------------------------+
| MCP Server (server.py) |
| +--------------------------------------------+ |
| | OAuth 2.0 Proxy (OAuthProxy) | |
| | - Handles user authentication | |
| | - Manages proxy tokens | |
| | - Token exchange with upstream | |
| +--------------------------------------------+ |
| +--------------------------------------------+ |
| | MCP Tools (@mcp.tool() functions) | |
| | - create_record() | |
| | - get_record() | |
| | - update_record() | |
| | - delete_record() | |
| | - execute_query() | |
| +--------------------------------------------+ |
| +--------------------------------------------+ |
| | HTTP Routes (/health, /debug/*) | |
| +--------------------------------------------+ |
+--------------------------------------------------+
|
| HTTPS + Bearer Token
|
+---v----------------------------------------------+
| API Client (api_client.py) |
| - HTTP requests with retry logic |
| - Response parsing |
| - Error handling |
+--------------------------------------------------+
|
| REST API calls
|
+---v----------------------------------------------+
| Upstream Service (NetSuite, Salesforce, etc.) |
+--------------------------------------------------+

Module Dependency Flow
__main__.py / wsgi.py
-> server.py (main server, tools, OAuth, routes)
-> api_client.py (HTTP client for upstream API)
-> config.py (environment configuration)
-> models.py (Pydantic data models)
-> exceptions.py (error hierarchy)
-> auth.py (token caching & refresh)
-> config.py
-> exceptions.py
-> utils.py (logging, sanitization, helpers)

Project Structure
enterprise-mcp-template/
|-- .env.example # Environment variable template
|-- .gitignore # Git ignore rules
|-- docker-compose.yml # Docker Compose for local dev
|-- Dockerfile # Multi-stage production Docker build
|-- LICENSE # MIT License
|-- main.py # Root smoke test (not the entry point)
|-- pyproject.toml # Python project configuration
|-- README.md # This file
|-- CLAUDE.md # AI agent instructions
|-- requirements.txt # Production dependencies
|-- startup.sh # Azure Web App startup script
|
|-- docs/ # Documentation
| |-- guide.pdf # PDF version of this guide
|
|-- samples/ # Example payloads
| |-- example_payload.json # Sample API request payload
|
|-- src/
| |-- my_mcp_server/ # Main package (RENAME THIS)
| |-- __init__.py # Package init with lazy imports
| |-- __main__.py # CLI entry point (python -m my_mcp_server)
| |-- server.py # *** MAIN FILE *** MCP server + tools + OAuth
| |-- api_client.py # HTTP client for upstream API
| |-- auth.py # Token management (LRU cache + refresh)
| |-- config.py # Environment-based configuration
| |-- models.py # Pydantic data models
| |-- exceptions.py # Exception hierarchy
| |-- utils.py # Utility functions
| |-- wsgi.py # ASGI entry point for production
| |-- static/
| |-- index.html # Browser-friendly status page
|
|-- tests/ # Test suite
|-- __init__.py
|-- test_config.py # Config tests
|-- test_models.py # Model tests
|-- test_auth.py # Auth/token tests

How to Create a New MCP Server
Step 1: Global Find & Replace
| Find | Replace With |
| --- | --- |
| my_mcp_server | Your package name (snake_case) |
| my-mcp-server | Your package name (kebab-case) |
| {{PROJECT_NAME}} | Display name |
| {{AUTHOR}} | Your name/org |
| | Your service prefix |
| example.com | Your API domain |
Step 2: Update OAuth Endpoints (server.py)
In _build_auth_provider(), update:
# BEFORE (template):
auth_endpoint = f"https://{account_id}.app.example.com/oauth2/authorize"
token_endpoint = f"https://{account_id}.api.example.com/oauth2/token"
api_scopes = ["api_access"]
# AFTER (example for NetSuite):
auth_endpoint = f"https://{account_id}.app.netsuite.com/app/login/oauth2/authorize.nl"
token_endpoint = f"https://{account_id}.suitetalk.api.netsuite.com/services/rest/auth/oauth2/v1/token"
api_scopes = ["rest_webservices"]

Step 3: Update API URL Patterns (config.py, api_client.py)
In config.py UpstreamAPIConfig.build_api_base_url():
# BEFORE:
return f"https://{self.account_id}.api.example.com/v1"
# AFTER (NetSuite):
return f"https://{self.account_id}.suitetalk.api.netsuite.com/services/rest/record/v1"

Step 4: Define Your MCP Tools (server.py)
Replace the generic CRUD tools with domain-specific ones:
@mcp.tool()
async def create_customer(
customer_data: Dict[str, Any],
account_id: Optional[str] = None,
) -> Dict[str, Any]:
"""
Create a new customer in Salesforce.
Args:
customer_data: Customer fields (Name, Email, Phone, etc.)
account_id: Salesforce org ID
Returns:
Structured response with the created customer's ID.
"""
token = _get_oauth_token()
async with _get_client(account_id=account_id) as client:
response = await client.create_record(
access_token=token,
record_type="customer",
payload=customer_data,
)
    return _serialize_response(response)

Step 5: Update Models (models.py)
Replace example models with your domain entities:
class CustomerPayload(BaseModel):
name: str = Field(..., description="Customer name")
email: Optional[str] = Field(default=None)
phone: Optional[str] = Field(default=None)
    # ... your fields

Step 6: Test and Deploy
# Run tests
pytest
# Local HTTP test
python -m your_package http
# Visit http://localhost:8000/health
# Docker
docker compose up --build

OAuth 2.0 Authentication Deep Dive
How OAuth Works in This Template
1. AI Client connects to MCP server
|
2. MCP server redirects user to upstream login page
| (via OAuthProxy)
|
3. User logs in at upstream service (NetSuite, Salesforce, etc.)
|
4. Upstream redirects back with authorization code
| -> https://your-server.com/auth/callback?code=ABC123
|
5. OAuthProxy exchanges code for access token (server-to-server)
| POST to token endpoint with client_id + client_secret
|
6. OAuthProxy stores the real token, gives client a proxy token
|
7. Client sends proxy token with each MCP tool call
|
8. OAuthProxy looks up real token, passes to tool function
|
9. Tool function uses real token to call upstream API

Critical OAuth Configuration
auth = OAuthProxy(
# WHERE users log in
upstream_authorization_endpoint=auth_endpoint,
# WHERE we exchange codes for tokens
upstream_token_endpoint=token_endpoint,
# OUR app's credentials
upstream_client_id=client_id,
upstream_client_secret=client_secret,
# HOW we verify proxy tokens
token_verifier=token_verifier,
# PUBLIC URL for callbacks
base_url=base_url,
# HOW we send credentials to token endpoint
# "client_secret_basic" = Authorization header (most APIs)
# "client_secret_post" = POST body parameters
token_endpoint_auth_method="client_secret_basic",
# PKCE handling - CRITICAL!
# Set to False if upstream handles PKCE with browser directly
# Set to True if you need to forward PKCE params
forward_pkce=False,
# OAuth scopes
valid_scopes=api_scopes,
# Accept any MCP client redirect URI
allowed_client_redirect_uris=None,
# Sign proxy JWTs with a stable key (set MCP_JWT_SIGNING_KEY in prod!)
jwt_signing_key=jwt_signing_key,
# Skip our consent screen (upstream has its own)
require_authorization_consent=False,
# In-memory client storage (resets on restart - intentional)
client_storage=client_storage,
)

OAuth Gotchas (Lessons Learned)
- forward_pkce=False: If your upstream API handles PKCE between itself and the browser, do NOT forward your own PKCE parameters. Your server's code_verifier won't match the browser's code_challenge, causing invalid_grant errors.
- required_scopes on DebugTokenVerifier: Without this, clients registered via DCR get scope="" and ALL scope requests are rejected with invalid_scope before reaching the upstream.
- MCP_JWT_SIGNING_KEY: Without a stable key, the OAuthProxy generates a random key on each startup. Container restarts invalidate ALL proxy tokens. Always set it in production.
- MemoryStore for client storage: Resets on restart. This is actually GOOD - it prevents stale client registrations from previous deployments.
- token_endpoint_auth_method: Test both "client_secret_basic" and "client_secret_post" using the /debug/token-test endpoint. The wrong method gives invalid_client instead of invalid_grant.
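To avoid the restart problem, generate a signing key once and pin it via the environment. This is the same one-liner shown in the deployment section, expanded into a runnable snippet:

```python
import secrets

# One-time generation of a stable signing key. Set the printed value as
# MCP_JWT_SIGNING_KEY in your deployment environment so proxy tokens
# survive container restarts.
signing_key = secrets.token_hex(32)  # 32 random bytes -> 64 hex characters
print(signing_key)
```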
Libraries & Dependencies
| Library | Version | Purpose | Why This Library |
| --- | --- | --- | --- |
| fastmcp | >=3.0.0b2 | MCP framework | Only production-grade MCP framework. Handles protocol, OAuth, transport. |
| httpx | >=0.27.0 | HTTP client | Async HTTP client with connection pooling. Superior to requests for async. |
| pydantic | >=2.0.0 | Data validation | Industry standard. Auto-validation, serialization, IDE support. |
| pydantic-settings | >=2.1.0 | Settings management | Pydantic extension for env var parsing. |
| python-dotenv | >=1.0.0 | .env file loading | Loads .env files for local development. |
| loguru | >=0.7.2 | Logging | Enhanced logging (optional, can use stdlib). |
| gunicorn | >=21.2.0 | Process manager | Production WSGI/ASGI server. Multi-worker, graceful restarts. |
| uvicorn | >=0.27.0 | ASGI server | High-performance async HTTP server. Used as gunicorn worker class. |
Why FastMCP 3.0?
FastMCP 3.0 is the only production-grade MCP framework available. Key features:
- Native host/port support in .run()
- Built-in OAuthProxy for OAuth 2.0 authentication
- DebugTokenVerifier for development/testing
- get_access_token() dependency injection
- Support for three transports: stdio, SSE, HTTP
- @mcp.tool() decorator for registering tools
- @mcp.custom_route() for HTTP endpoints
- Stateless HTTP mode for cloud load balancers
Why httpx over requests?
- Async support: httpx.AsyncClient works natively with async/await
- Connection pooling: Reuses TCP connections automatically
- Timeout control: Granular timeout settings per request
- HTTP/2 support: Optional HTTP/2 for better performance
- requests-compatible API: Easy to migrate from requests
Configuration System
All configuration uses environment variables following the 12-Factor App methodology.
Configuration Hierarchy
AppConfig
├── UpstreamAPIConfig (API connection: URL, credentials, timeouts)
├── TokenStoreConfig (Token caching: LRU size, expiry buffer)
└── ServerConfig (Server: name, transport, host, port)

Key Environment Variables
| Variable | Required | Default | Description |
| --- | --- | --- | --- |
| | Yes* | - | Account/tenant identifier |
| | Yes* | - | OAuth client ID |
| | Yes* | - | OAuth client secret |
| MCP_SERVER_BASE_URL | Yes* | - | Public URL for OAuth callbacks |
| MCP_TRANSPORT | No | | Transport: stdio/sse/http |
| | No | | Server port |
| | No | | Server host binding |
| | No | | Enable token LRU cache |
| | No | | Refresh buffer (seconds) |
| MCP_JWT_SIGNING_KEY | No | random | Stable JWT key for production |
| | No | | DEBUG/INFO/WARNING/ERROR |
| | No | | Enable debug mode |
*Required for OAuth authentication. Server runs without auth if missing.
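The template parses these with pydantic-settings (see config.py). A minimal stdlib sketch of the same idea, using only two of the variables above (the field names here are illustrative, not the template's exact fields, and MCP_SERVER_PORT is an assumed variable name):

```python
import os
from dataclasses import dataclass


@dataclass(frozen=True)
class ServerConfig:
    # Illustrative subset; the real fields live in config.py.
    transport: str
    port: int


def load_server_config() -> ServerConfig:
    """Read server settings from environment variables, with defaults."""
    return ServerConfig(
        transport=os.getenv("MCP_TRANSPORT", "stdio"),
        port=int(os.getenv("MCP_SERVER_PORT", "8000")),
    )
```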
Singleton Pattern
from config import get_config, set_config, reset_config
# Normal usage (reads env vars once, caches globally)
config = get_config()
base_url = config.upstream.build_api_base_url()
# Testing (override with custom config)
set_config(AppConfig(server=ServerConfig(port=9999)))
# Reset (force re-read from env)
reset_config()

MCP Tools Pattern
Every MCP tool follows this exact pattern:
@mcp.tool()
async def my_tool(
required_param: str,
optional_param: Optional[str] = None,
account_id: Optional[str] = None,
base_url: Optional[str] = None,
) -> Dict[str, Any]:
"""
Tool description (AI reads this to decide when to use the tool).
Args:
required_param: Description for AI
optional_param: Description for AI
account_id: Account ID (if not preconfigured)
base_url: Override API URL
Returns:
Structured response dict with ok, status_code, data, errors.
"""
# 1. Get OAuth token from MCP session
token = _get_oauth_token()
# 2. Create API client (async context manager for cleanup)
async with _get_client(base_url, account_id) as client:
# 3. Call the appropriate client method
response = await client.some_method(
access_token=token,
...
)
# 4. Serialize and return
    return _serialize_response(response)

Rules for MCP Tools
- Return simple Python objects (dict, list, str, number). They're serialized to JSON.
- Docstrings matter: AI reads them to decide when/how to use the tool.
- Parameter types matter: FastMCP generates JSON Schema from type hints.
- Always use _serialize_response(): Provides a consistent response format.
- Always use async with: Ensures HTTP client cleanup on error.
- Add account_id and base_url params: Lets AI clients specify targets dynamically.
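The consistent envelope that _serialize_response() produces (ok, status_code, data, errors) could look like this minimal sketch; the real helper lives in server.py and may differ in detail:

```python
from typing import Any, Dict, List, Optional


def serialize_response(
    ok: bool,
    status_code: int,
    data: Any = None,
    errors: Optional[List[str]] = None,
) -> Dict[str, Any]:
    """Sketch of a consistent tool-response envelope for AI clients."""
    return {
        "ok": ok,
        "status_code": status_code,
        "data": data,
        "errors": errors or [],
    }
```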
API Client Pattern
The API client (api_client.py) handles all HTTP communication:
async with APIClient(base_url="https://api.example.com/v1") as client:
# Generic CRUD
response = await client.create_record(token, "customer", payload)
response = await client.get_record(token, "customer", "123")
response = await client.update_record(token, "customer", "123", updates)
response = await client.delete_record(token, "customer", "123")
# Query (if your API supports it)
    response = await client.execute_query(token, "SELECT * FROM Customer")

Retry Logic
Attempt 1: Immediate
Attempt 2: Wait 0.5s (backoff_factor * 2^0)
Attempt 3: Wait 1.0s (backoff_factor * 2^1)
Attempt 4: Wait 2.0s (backoff_factor * 2^2)

Retries on: 429, 500, 502, 503, 504, timeouts, connection errors.
Does NOT retry: 400, 401, 403, 404.
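The schedule above (a 0.5 backoff_factor, doubling on each retry) and the retryable/non-retryable split can be sketched as pure helpers; the template's actual retry loop lives in api_client.py:

```python
from typing import List

# Status codes worth retrying; 4xx client errors like 400/401/403/404 are not.
RETRYABLE_STATUS = {429, 500, 502, 503, 504}


def should_retry(status_code: int) -> bool:
    """True only for transient upstream failures."""
    return status_code in RETRYABLE_STATUS


def backoff_delays(retries: int, backoff_factor: float = 0.5) -> List[float]:
    """Delay before each retry: backoff_factor * 2**n for n = 0, 1, 2, ..."""
    return [backoff_factor * (2 ** n) for n in range(retries)]
```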
Token Management
LRU Token Cache
Token Cache (max 100 entries)
+---------+------------------+-----------+
| Key | Token | Expires |
+---------+------------------+-----------+
| sha256 | eyJhbG... | 1hr | <- Most recently used
| sha256 | eyJxyz... | 45min |
| sha256 | eyJabc... | 30min |
| ... | ... | ... |
| sha256 | eyJold... | 10min | <- Least recently used (evicted first)
+---------+------------------+-----------+

Token Lifecycle
1. User authenticates -> access_token + refresh_token
2. Token cached with SHA-256 key
3. On each API call: check if cached token is still valid
4. If expired (with 5-min buffer): attempt refresh
5. If refresh succeeds: cache new token
6. If refresh fails: user must re-authenticate

Exception Hierarchy
MCPServerError (catch-all)
├── ConfigurationError
│ ├── MissingConfigurationError
│ └── InvalidConfigurationError
├── AuthenticationError
│ ├── TokenError
│ │ ├── TokenExpiredError
│ │ ├── TokenRefreshError
│ │ └── TokenValidationError
│ └── InvalidCredentialsError
├── APIError
│ ├── ConnectionError
│ ├── TimeoutError
│ ├── RateLimitError
│ ├── NotFoundError
│ ├── ValidationError
│ ├── PermissionError
│ └── ServerError
└── RecordError
├── RecordNotFoundError
├── RecordValidationError
    └── DuplicateRecordError

Every exception has to_dict() for JSON serialization and a machine-readable code field.
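A minimal sketch of that base-class pattern (the real hierarchy lives in exceptions.py; the `code` strings and constructor shape here are illustrative assumptions):

```python
from typing import Any, Dict


class MCPServerError(Exception):
    """Catch-all base: every subclass carries a machine-readable code."""

    code = "mcp_server_error"  # illustrative default

    def __init__(self, message: str, **details: Any) -> None:
        super().__init__(message)
        self.message = message
        self.details = details

    def to_dict(self) -> Dict[str, Any]:
        """JSON-serializable view, suitable for returning to AI clients."""
        return {"error": self.code, "message": self.message, "details": self.details}


class RecordNotFoundError(MCPServerError):
    code = "record_not_found"
```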
Deployment Guide
Local Development (stdio)
python -m my_mcp_server
# Communicates via stdin/stdout - used by Claude Desktop

Local HTTP Server
python -m my_mcp_server http
# Available at http://localhost:8000
# Health: http://localhost:8000/health
# MCP: http://localhost:8000/mcp

Docker
# Build and run
docker compose up --build
# Or standalone
docker build -t my-mcp .
docker run -p 8000:8000 --env-file .env my-mcp

Azure Web App
# Option 1: Container deployment
az webapp create --name my-mcp --plan my-plan --deployment-container-image-name my-mcp:latest
# Option 2: Source deployment
az webapp up --name my-mcp --runtime PYTHON:3.11
# Set environment variables in Azure Portal:
# Settings -> Configuration -> Application settings

Required Azure settings:
- All UPSTREAM_* env vars
- MCP_SERVER_BASE_URL=https://my-mcp.azurewebsites.net
- MCP_TRANSPORT=http
- MCP_JWT_SIGNING_KEY=<generate with: python -c "import secrets; print(secrets.token_hex(32))">
Claude Desktop Configuration
Add to claude_desktop_config.json:
{
"mcpServers": {
"my-mcp": {
"url": "https://my-mcp.azurewebsites.net/mcp"
}
}
}

Testing
# Run all tests
pytest
# With coverage
pytest --cov=my_mcp_server --cov-report=html
# Specific test file
pytest tests/test_config.py -v
# Run with verbose output
pytest -v -s

Test Structure
- test_config.py - Environment parsing, config validation, singleton
- test_models.py - Pydantic model validation, serialization, factories
- test_auth.py - Token caching, expiry checking, LRU eviction
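test_auth.py exercises expiry checking with the 5-minute refresh buffer described in the token lifecycle above. A self-contained sketch of that check (the helper name and signature are illustrative, not the template's actual API):

```python
import time

EXPIRY_BUFFER_SECONDS = 300  # refresh 5 minutes before actual expiry


def needs_refresh(expires_at: float, now: float = None) -> bool:
    """True when the token is expired or inside the refresh buffer."""
    current = time.time() if now is None else now
    return current >= expires_at - EXPIRY_BUFFER_SECONDS
```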
Best Practices & Gotchas
DO
- Always use async with for API clients - ensures HTTP connection cleanup
- Always sanitize sensitive data before logging - use sanitize_for_logging()
- Always return APIResponse from tools - consistent interface for AI clients
- Set MCP_JWT_SIGNING_KEY in production - prevents token invalidation on restart
- Log to stderr, not stdout - stdout is reserved for the MCP protocol in stdio mode
- Use UTC for all timestamps - datetime.now(timezone.utc)
- Add an account_id parameter to tools - lets AI specify targets dynamically
- Write descriptive docstrings - AI reads them to decide tool usage
- Use environment variables for ALL config - never hardcode credentials
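The sanitize-before-logging rule could be implemented like this minimal sketch; the real sanitize_for_logging() lives in utils.py, and the key set here is an illustrative assumption:

```python
from typing import Any, Dict

# Illustrative set; the template's actual list lives in utils.py.
SENSITIVE_KEYS = {"password", "token", "access_token", "refresh_token", "client_secret"}


def sanitize_for_logging(data: Dict[str, Any]) -> Dict[str, Any]:
    """Return a copy of the dict with sensitive values redacted."""
    return {
        key: "***REDACTED***" if key.lower() in SENSITIVE_KEYS else value
        for key, value in data.items()
    }
```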
DON'T
- Don't log raw tokens - use the mask_token() helper
- Don't hardcode API URLs - use config.py and env vars
- Don't catch bare Exception - use the exception hierarchy
- Don't use the requests library - use httpx for async support
- Don't write to stdout in stdio mode - it corrupts the MCP protocol
- Don't skip the token expiry buffer - tokens can expire mid-request
- Don't use functools.lru_cache for tokens - you need expiry-aware eviction
- Don't forward PKCE if the upstream handles it - causes invalid_grant
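The mask_token() rule could look like this minimal sketch (the real helper is in utils.py; the exact masking format here is an assumption):

```python
def mask_token(token: str, visible: int = 4) -> str:
    """Keep a short prefix for correlation in logs, hide the rest."""
    if len(token) <= visible:
        return "***"
    return token[:visible] + "..." + "*" * 8
```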
Troubleshooting
OAuth Issues
- Visit /health - shows if OAuth is configured and which env vars are set
- Visit /debug/logs?filter=oauth - shows OAuth flow logs
- Visit /debug/token-test - tests both auth methods against the upstream
- Visit /debug/server-info - shows if the container restarted (lost OAuth state)
Common Errors
| Error | Cause | Fix |
| --- | --- | --- |
| invalid_grant | PKCE mismatch or expired code | Set forward_pkce=False |
| invalid_client | Wrong auth method or credentials | Try both auth methods via /debug/token-test |
| invalid_scope | Missing required_scopes | Add required_scopes to the token verifier |
| | User not logged in | Connect via an MCP client with OAuth support |
| Token invalidated on restart | No stable JWT key | Set MCP_JWT_SIGNING_KEY |
Debug Endpoints
| Endpoint | Purpose |
| --- | --- |
| /health | Server status, config, OAuth info |
| /debug/logs | Recent server logs (in-memory buffer) |
| /debug/logs?filter=oauth | OAuth-specific logs |
| /debug/server-info | Instance ID, uptime, OAuth state counts |
| /debug/token-test | Test token exchange with upstream |
License
MIT License - See LICENSE for details.