create_index

Create a new Elasticsearch index with custom mapping and optional settings configuration to organize searchable data.

Instructions

Create a new Elasticsearch index with optional mapping and settings configuration

Input Schema

| Name | Required | Description | Default |
| --- | --- | --- | --- |
| index | Yes | Name of the new Elasticsearch index to create | |
| mapping | Yes | Index mapping configuration defining field types and properties | |
| settings | No | Optional index settings for shards, replicas, analysis, etc. | |
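
The schema does not prescribe any particular fields; as a purely illustrative example, a call might supply arguments like the following (the index name, field names, and settings values are hypothetical, not part of the tool's schema):

```python
# Hypothetical arguments for a create_index call.
example_arguments = {
    "index": "articles",                      # required: name of the new index
    "mapping": {                              # required: field types and properties
        "properties": {
            "title": {"type": "text"},
            "published_at": {"type": "date"},
            "tags": {"type": "keyword"},
        }
    },
    "settings": {                             # optional: shard/replica/analysis settings
        "number_of_shards": 1,
        "number_of_replicas": 0,
    },
}
```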

Implementation Reference

  • Core handler implementation for the 'create_index' tool. It performs Elasticsearch index creation with a mandatory metadata-governance check, special handling for the index_metadata system index, detailed success/error responses, and exception handling for common failure modes (connection, duplicate index, invalid mapping, permissions).
```python
@app.tool(
    description="Create a new Elasticsearch index with optional mapping and settings configuration",
    tags={"elasticsearch", "create", "index", "mapping"}
)
async def create_index(
    index: Annotated[str, Field(description="Name of the new Elasticsearch index to create")],
    mapping: Annotated[Dict[str, Any], Field(description="Index mapping configuration defining field types and properties")],
    settings: Annotated[Optional[Dict[str, Any]], Field(description="Optional index settings for shards, replicas, analysis, etc.")] = None
) -> str:
    """Create a new Elasticsearch index with mapping and optional settings."""
    try:
        es = get_es_client()

        # Special case: Allow creating index_metadata without validation
        if index == "index_metadata":
            body = {"mappings": mapping}
            if settings:
                body["settings"] = settings
            result = es.indices.create(index=index, body=body)
            return (f"✅ Index metadata system initialized successfully!\n\n" +
                    f"📋 **Metadata Index Created**: {index}\n" +
                    f"🔧 **System Status**: Index metadata management now active\n" +
                    f"✅ **Next Steps**:\n" +
                    f"   1. Use 'create_index_metadata' to document your indices\n" +
                    f"   2. Then use 'create_index' to create actual indices\n" +
                    f"   3. Use 'list_indices' to see metadata integration\n\n" +
                    f"🎯 **Benefits Unlocked**:\n" +
                    f"   • Index governance and documentation enforcement\n" +
                    f"   • Enhanced index listing with descriptions\n" +
                    f"   • Proper cleanup workflows for index deletion\n" +
                    f"   • Team collaboration through shared index understanding\n\n" +
                    f"📋 **Technical Details**:\n{json.dumps(result, indent=2, ensure_ascii=False)}")

        # Check if metadata document exists for this index
        metadata_index = "index_metadata"
        try:
            # Search for existing metadata document
            search_body = {
                "query": {"term": {"index_name": index}},
                "size": 1
            }
            metadata_result = es.search(index=metadata_index, body=search_body)

            if metadata_result['hits']['total']['value'] == 0:
                return (f"❌ Index creation blocked - Missing metadata documentation!\n\n" +
                        f"🚨 **MANDATORY: Create Index Metadata First**:\n" +
                        f"   📋 **Required Action**: Before creating index '{index}', you must document it\n" +
                        f"   🔧 **Use This Tool**: Call 'create_index_metadata' tool first\n" +
                        f"   📝 **Required Information**:\n" +
                        f"      • Index purpose and description\n" +
                        f"      • Data types and content it will store\n" +
                        f"      • Usage patterns and access frequency\n" +
                        f"      • Retention policies and lifecycle\n" +
                        f"      • Related indices and dependencies\n\n" +
                        f"💡 **Workflow**:\n" +
                        f"   1. Call 'create_index_metadata' with index name and description\n" +
                        f"   2. Then call 'create_index' again to create the actual index\n" +
                        f"   3. This ensures proper documentation and governance\n\n" +
                        f"🎯 **Why This Matters**:\n" +
                        f"   • Prevents orphaned indices without documentation\n" +
                        f"   • Ensures team understands index purpose\n" +
                        f"   • Facilitates better index management and cleanup\n" +
                        f"   • Provides context for future maintenance")
        except Exception as metadata_error:
            # If metadata index doesn't exist, that's also a problem
            if "index_not_found" in str(metadata_error).lower():
                return (f"❌ Index creation blocked - Metadata system not initialized!\n\n" +
                        f"🚨 **SETUP REQUIRED**: Index metadata system needs initialization\n" +
                        f"   📋 **Step 1**: Create metadata index first using 'create_index' with name 'index_metadata'\n" +
                        f"   📝 **Step 2**: Use this mapping for metadata index:\n" +
                        f"```json\n" +
                        f"{{\n" +
                        f"  \"properties\": {{\n" +
                        f"    \"index_name\": {{\"type\": \"keyword\"}},\n" +
                        f"    \"description\": {{\"type\": \"text\"}},\n" +
                        f"    \"purpose\": {{\"type\": \"text\"}},\n" +
                        f"    \"data_types\": {{\"type\": \"keyword\"}},\n" +
                        f"    \"created_by\": {{\"type\": \"keyword\"}},\n" +
                        f"    \"created_date\": {{\"type\": \"date\"}},\n" +
                        f"    \"usage_pattern\": {{\"type\": \"keyword\"}},\n" +
                        f"    \"retention_policy\": {{\"type\": \"text\"}},\n" +
                        f"    \"related_indices\": {{\"type\": \"keyword\"}},\n" +
                        f"    \"tags\": {{\"type\": \"keyword\"}}\n" +
                        f"  }}\n" +
                        f"}}\n" +
                        f"```\n" +
                        f"   🔧 **Step 3**: Then use 'create_index_metadata' to document your index\n" +
                        f"   ✅ **Step 4**: Finally create your actual index\n\n" +
                        f"💡 **This is a one-time setup** - once metadata index exists, normal workflow applies")

        # If we get here, metadata exists - proceed with index creation
        body = {"mappings": mapping}
        if settings:
            body["settings"] = settings
        result = es.indices.create(index=index, body=body)
        return f"✅ Index '{index}' created successfully:\n\n{json.dumps(result, indent=2, ensure_ascii=False)}"

    except Exception as e:
        # Provide detailed error messages for different types of Elasticsearch errors
        error_message = "❌ Failed to create index:\n\n"
        error_str = str(e).lower()

        if "connection" in error_str or "refused" in error_str:
            error_message += "🔌 **Connection Error**: Cannot connect to Elasticsearch server\n"
            error_message += f"📍 Check if Elasticsearch is running at the configured address\n"
            error_message += f"💡 Try: Use 'setup_elasticsearch' tool to start Elasticsearch\n\n"
        elif "already exists" in error_str or "resource_already_exists" in error_str:
            error_message += f"📁 **Index Exists**: Index '{index}' already exists\n"
            error_message += f"📍 Cannot create an index that already exists\n"
            error_message += f"💡 Try: Use 'delete_index' first, or choose a different name\n\n"
        elif "mapping" in error_str or "invalid" in error_str:
            error_message += f"📝 **Mapping Error**: Invalid index mapping or settings\n"
            error_message += f"📍 The provided mapping/settings are not valid\n"
            error_message += f"💡 Try: Check mapping syntax and field types\n\n"
        elif "permission" in error_str or "forbidden" in error_str:
            error_message += "🔒 **Permission Error**: Not allowed to create index\n"
            error_message += f"📍 Insufficient permissions for index creation\n"
            error_message += f"💡 Try: Check Elasticsearch security settings\n\n"
        else:
            error_message += f"⚠️ **Unknown Error**: {str(e)}\n\n"

        error_message += f"🔍 **Technical Details**: {str(e)}"
        return error_message
```
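
Stripped of the formatted response text, the handler's governance flow reduces to two Elasticsearch calls: a term query against index_metadata, then indices.create. A minimal sketch of that flow, assuming an elasticsearch-py client pointed at a local cluster (the handler obtains its client via get_es_client(); the index name and mapping here are illustrative):

```python
from elasticsearch import Elasticsearch

es = Elasticsearch("http://localhost:9200")  # assumption: local cluster, as get_es_client() would provide

index = "articles"  # hypothetical index name
mapping = {"properties": {"title": {"type": "text"}}}

# Governance gate: a metadata document for this index must already exist.
hits = es.search(
    index="index_metadata",
    body={"query": {"term": {"index_name": index}}, "size": 1},
)
if hits["hits"]["total"]["value"] == 0:
    raise RuntimeError("Document the index via 'create_index_metadata' before creating it")

# Only then is the actual index created, mirroring the handler's final step.
es.indices.create(index=index, body={"mappings": mapping})
```
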
  • Pydantic-based input schema for the create_index tool, defining the parameters index (str), mapping (Dict[str, Any]), and settings (Optional[Dict[str, Any]], default None), with descriptions supplied via Field.
```python
async def create_index(
    index: Annotated[str, Field(description="Name of the new Elasticsearch index to create")],
    mapping: Annotated[Dict[str, Any], Field(description="Index mapping configuration defining field types and properties")],
    settings: Annotated[Optional[Dict[str, Any]], Field(description="Optional index settings for shards, replicas, analysis, etc.")] = None
) -> str:
```
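
From a caller's perspective, arguments matching this signature are passed as a plain JSON object. A usage sketch, assuming the FastMCP 2.x Python client over an HTTP transport (the server URL and argument values are placeholders):

```python
import asyncio
from fastmcp import Client

async def main() -> None:
    # Placeholder endpoint; point this at wherever the MCP server is actually running.
    async with Client("http://localhost:8000/mcp") as client:
        result = await client.call_tool("create_index", {
            "index": "articles",  # hypothetical index name
            "mapping": {"properties": {"title": {"type": "text"}}},
            "settings": {"number_of_replicas": 0},
        })
        print(result)

asyncio.run(main())
```
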
  • Mounts the elasticsearch_index sub-server app (containing create_index) into the unified Elasticsearch server app.
```python
from .sub_servers.elasticsearch_index import app as index_app
from .sub_servers.elasticsearch_search import app as search_app
from .sub_servers.elasticsearch_batch import app as batch_app

# Create unified FastMCP application
app = FastMCP(
    name="AgentKnowledgeMCP-Elasticsearch",
    version="2.0.0",
    instructions="Unified Elasticsearch tools for comprehensive knowledge management via modular server mounting"
)

# ================================
# SERVER MOUNTING - MODULAR ARCHITECTURE
# ================================
print("🏗️ Mounting Elasticsearch sub-servers...")

# Mount all sub-servers into unified interface
app.mount(snapshots_app)       # 3 tools: snapshot management
app.mount(index_metadata_app)  # 3 tools: metadata governance
app.mount(document_app)        # 3 tools: document operations
app.mount(index_app)           # 3 tools: index management
```
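
Each mounted sub-server, including elasticsearch_index, is itself a FastMCP app that registers its tools with @app.tool. A simplified, hypothetical sketch of that pattern (the server name, tool, and parameter are illustrative, not the actual sub-server code):

```python
from typing import Annotated
from fastmcp import FastMCP
from pydantic import Field

# Hypothetical sub-server following the same pattern as elasticsearch_index
app = FastMCP(name="Example-Index-Tools")

@app.tool(description="Example tool registered on a sub-server", tags={"example"})
async def example_tool(
    name: Annotated[str, Field(description="Illustrative parameter")],
) -> str:
    return f"hello {name}"

# A parent server later composes this module via parent_app.mount(app)
```
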
  • Mounts the unified elasticsearch_server_app (including create_index via sub-mount) into the main AgentKnowledgeMCP server.
```python
from src.elasticsearch.elasticsearch_server import app as elasticsearch_server_app
from src.prompts.prompt_server import app as prompt_server_app

# Import middleware
from src.middleware.confirmation_middleware import ConfirmationMiddleware

# Load configuration and initialize components
CONFIG = load_config()
init_security(CONFIG["security"]["allowed_base_directory"])

# Initialize confirmation manager
confirmation_manager = initialize_confirmation_manager(CONFIG)
print(f"✅ Confirmation system initialized (enabled: {CONFIG.get('confirmation', {}).get('enabled', True)})")

# Auto-setup Elasticsearch if needed
print("🔍 Checking Elasticsearch configuration...")
config_path = Path(__file__).parent / "config.json"
setup_result = auto_setup_elasticsearch(config_path, CONFIG)
if setup_result["status"] == "setup_completed":
    # Reload config after setup
    CONFIG = load_config()
    print("✅ Elasticsearch auto-setup completed")
elif setup_result["status"] == "already_configured":
    print("✅ Elasticsearch already configured")
elif setup_result["status"] == "setup_failed":
    print(f"⚠️ Elasticsearch auto-setup failed: {setup_result.get('error', 'Unknown error')}")
    print("📝 You can manually setup using the 'setup_elasticsearch' tool")

init_elasticsearch(CONFIG)

# Create main FastMCP server
app = FastMCP(
    name=CONFIG["server"]["name"],
    version=CONFIG["server"]["version"],
    instructions="🏗️ AgentKnowledgeMCP - Modern FastMCP server with modular composition architecture for knowledge management, Elasticsearch operations, file management, and system administration"
)

# ================================
# MIDDLEWARE CONFIGURATION
# ================================
print("🔒 Adding confirmation middleware...")

# Add confirmation middleware to main server
app.add_middleware(ConfirmationMiddleware())
print("✅ Confirmation middleware added successfully!")

# ================================
# SERVER COMPOSITION - MOUNTING
# ================================
print("🏗️ Mounting individual servers into main server...")

# Mount Elasticsearch server with 'es' prefix
# This provides: es_search, es_index_document, es_create_index, etc.
app.mount(elasticsearch_server_app)
```
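
Once composition is complete, the mounted tools can be inspected in-process. A small sketch assuming the FastMCP Python client's in-memory transport, where app refers to the main server object created above:

```python
import asyncio
from fastmcp import Client

async def list_composed_tools() -> None:
    # In-memory client against the composed app; no network transport needed.
    async with Client(app) as client:
        tools = await client.list_tools()
        # create_index should appear here (possibly under a prefixed name,
        # depending on how the sub-servers were mounted).
        print(sorted(tool.name for tool in tools))

asyncio.run(list_composed_tools())
```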
