
create_index

Set up a new Elasticsearch index with custom mapping and optional settings to define field types, shards, replicas, and analysis for structured data management.

Instructions

Create a new Elasticsearch index with optional mapping and settings configuration

Input Schema

| Name | Required | Description | Default |
| --- | --- | --- | --- |
| index | Yes | Name of the new Elasticsearch index to create | |
| mapping | Yes | Index mapping configuration defining field types and properties | |
| settings | No | Optional index settings for shards, replicas, analysis, etc. | None |
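
As an illustration only (the index name, fields, and shard/replica counts below are hypothetical, not values defined by this tool), a call to create_index might supply arguments shaped like this:

    # Hypothetical example arguments for the create_index tool.
    example_arguments = {
        "index": "articles",
        "mapping": {
            "properties": {
                "title": {"type": "text"},
                "published": {"type": "date"},
                "tags": {"type": "keyword"},
            }
        },
        "settings": {
            "number_of_shards": 1,
            "number_of_replicas": 0,
        },
    }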

Implementation Reference

  • The core handler function for the 'create_index' tool. Defines input schema via Annotated[Field], performs metadata validation before index creation, handles special 'index_metadata' case, integrates with Elasticsearch client, and provides rich formatted responses with guidance.
    @app.tool(
        description="Create a new Elasticsearch index with optional mapping and settings configuration",
        tags={"elasticsearch", "create", "index", "mapping"}
    )
    async def create_index(
        index: Annotated[str, Field(description="Name of the new Elasticsearch index to create")],
        mapping: Annotated[Dict[str, Any], Field(description="Index mapping configuration defining field types and properties")],
        settings: Annotated[Optional[Dict[str, Any]], Field(description="Optional index settings for shards, replicas, analysis, etc.")] = None
    ) -> str:
        """Create a new Elasticsearch index with mapping and optional settings."""
        try:
            es = get_es_client()

            # Special case: Allow creating index_metadata without validation
            if index == "index_metadata":
                body = {"mappings": mapping}
                if settings:
                    body["settings"] = settings
                result = es.indices.create(index=index, body=body)
                return (f"βœ… Index metadata system initialized successfully!\n\n" +
                        f"πŸ“‹ **Metadata Index Created**: {index}\n" +
                        f"πŸ”§ **System Status**: Index metadata management now active\n" +
                        f"βœ… **Next Steps**:\n" +
                        f" 1. Use 'create_index_metadata' to document your indices\n" +
                        f" 2. Then use 'create_index' to create actual indices\n" +
                        f" 3. Use 'list_indices' to see metadata integration\n\n" +
                        f"🎯 **Benefits Unlocked**:\n" +
                        f" β€’ Index governance and documentation enforcement\n" +
                        f" β€’ Enhanced index listing with descriptions\n" +
                        f" β€’ Proper cleanup workflows for index deletion\n" +
                        f" β€’ Team collaboration through shared index understanding\n\n" +
                        f"πŸ“‹ **Technical Details**:\n{json.dumps(result, indent=2, ensure_ascii=False)}")

            # Check if metadata document exists for this index
            metadata_index = "index_metadata"
            try:
                # Search for existing metadata document
                search_body = {
                    "query": {
                        "term": {
                            "index_name": index
                        }
                    },
                    "size": 1
                }
                metadata_result = es.search(index=metadata_index, body=search_body)

                if metadata_result['hits']['total']['value'] == 0:
                    return (f"❌ Index creation blocked - Missing metadata documentation!\n\n" +
                            f"🚨 **MANDATORY: Create Index Metadata First**:\n" +
                            f" πŸ“‹ **Required Action**: Before creating index '{index}', you must document it\n" +
                            f" πŸ”§ **Use This Tool**: Call 'create_index_metadata' tool first\n" +
                            f" πŸ“ **Required Information**:\n" +
                            f" β€’ Index purpose and description\n" +
                            f" β€’ Data types and content it will store\n" +
                            f" β€’ Usage patterns and access frequency\n" +
                            f" β€’ Retention policies and lifecycle\n" +
                            f" β€’ Related indices and dependencies\n\n" +
                            f"πŸ’‘ **Workflow**:\n" +
                            f" 1. Call 'create_index_metadata' with index name and description\n" +
                            f" 2. Then call 'create_index' again to create the actual index\n" +
                            f" 3. This ensures proper documentation and governance\n\n" +
                            f"🎯 **Why This Matters**:\n" +
                            f" β€’ Prevents orphaned indices without documentation\n" +
                            f" β€’ Ensures team understands index purpose\n" +
                            f" β€’ Facilitates better index management and cleanup\n" +
                            f" β€’ Provides context for future maintenance")
            except Exception as metadata_error:
                # If metadata index doesn't exist, that's also a problem
                if "index_not_found" in str(metadata_error).lower():
                    return (f"❌ Index creation blocked - Metadata system not initialized!\n\n" +
                            f"🚨 **SETUP REQUIRED**: Index metadata system needs initialization\n" +
                            f" πŸ“‹ **Step 1**: Create metadata index first using 'create_index' with name 'index_metadata'\n" +
                            f" πŸ“ **Step 2**: Use this mapping for metadata index:\n" +
                            f"```json\n" +
                            f"{{\n" +
                            f" \"properties\": {{\n" +
                            f" \"index_name\": {{\"type\": \"keyword\"}},\n" +
                            f" \"description\": {{\"type\": \"text\"}},\n" +
                            f" \"purpose\": {{\"type\": \"text\"}},\n" +
                            f" \"data_types\": {{\"type\": \"keyword\"}},\n" +
                            f" \"created_by\": {{\"type\": \"keyword\"}},\n" +
                            f" \"created_date\": {{\"type\": \"date\"}},\n" +
                            f" \"usage_pattern\": {{\"type\": \"keyword\"}},\n" +
                            f" \"retention_policy\": {{\"type\": \"text\"}},\n" +
                            f" \"related_indices\": {{\"type\": \"keyword\"}},\n" +
                            f" \"tags\": {{\"type\": \"keyword\"}}\n" +
                            f" }}\n" +
                            f"}}\n" +
                            f"```\n" +
                            f" πŸ”§ **Step 3**: Then use 'create_index_metadata' to document your index\n" +
                            f" βœ… **Step 4**: Finally create your actual index\n\n" +
                            f"πŸ’‘ **This is a one-time setup** - once metadata index exists, normal workflow applies")

            # If we get here, metadata exists - proceed with index creation
            body = {"mappings": mapping}
            if settings:
                body["settings"] = settings
            result = es.indices.create(index=index, body=body)
            return f"βœ… Index '{index}' created successfully:\n\n{json.dumps(result, indent=2, ensure_ascii=False)}"

        except Exception as e:
            # Provide detailed error messages for different types of Elasticsearch errors
            error_message = "❌ Failed to create index:\n\n"
            error_str = str(e).lower()

            if "connection" in error_str or "refused" in error_str:
                error_message += "πŸ”Œ **Connection Error**: Cannot connect to Elasticsearch server\n"
                error_message += f"πŸ“ Check if Elasticsearch is running at the configured address\n"
                error_message += f"πŸ’‘ Try: Use 'setup_elasticsearch' tool to start Elasticsearch\n\n"
            elif "already exists" in error_str or "resource_already_exists" in error_str:
                error_message += f"πŸ“ **Index Exists**: Index '{index}' already exists\n"
                error_message += f"πŸ“ Cannot create an index that already exists\n"
                error_message += f"πŸ’‘ Try: Use 'delete_index' first, or choose a different name\n\n"
            elif "mapping" in error_str or "invalid" in error_str:
                error_message += f"πŸ“ **Mapping Error**: Invalid index mapping or settings\n"
                error_message += f"πŸ“ The provided mapping/settings are not valid\n"
                error_message += f"πŸ’‘ Try: Check mapping syntax and field types\n\n"
            elif "permission" in error_str or "forbidden" in error_str:
                error_message += "πŸ”’ **Permission Error**: Not allowed to create index\n"
                error_message += f"πŸ“ Insufficient permissions for index creation\n"
                error_message += f"πŸ’‘ Try: Check Elasticsearch security settings\n\n"
            else:
                error_message += f"⚠️ **Unknown Error**: {str(e)}\n\n"

            error_message += f"πŸ” **Technical Details**: {str(e)}"
            return error_message
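
For orientation, the following is a minimal, hypothetical sketch (not code from this repository) of driving the metadata-first workflow from a FastMCP client. It assumes the unified server object is importable as `app`, that the mounted tool names are unprefixed, and that 'create_index_metadata' accepts the index_name and description fields shown.

    # Hypothetical usage sketch: document the index first, then create it.
    import asyncio
    from fastmcp import Client

    async def create_documented_index() -> None:
        async with Client(app) as client:  # in-memory client against the mounted server
            # 1. Document the index first; otherwise create_index refuses to proceed.
            await client.call_tool("create_index_metadata", {
                "index_name": "articles",                             # hypothetical index
                "description": "Blog articles for full-text search",  # hypothetical metadata
            })
            # 2. Now create the actual index.
            result = await client.call_tool("create_index", {
                "index": "articles",
                "mapping": {"properties": {"title": {"type": "text"}}},
            })
            print(result)

    asyncio.run(create_documented_index())
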
  • Mounts the elasticsearch_index sub-server app (containing create_index tool) into the unified Elasticsearch server app.
    # Import sub-server applications for mounting
    from .sub_servers.elasticsearch_snapshots import app as snapshots_app
    from .sub_servers.elasticsearch_index_metadata import app as index_metadata_app
    from .sub_servers.elasticsearch_document import app as document_app
    from .sub_servers.elasticsearch_index import app as index_app
    from .sub_servers.elasticsearch_search import app as search_app
    from .sub_servers.elasticsearch_batch import app as batch_app

    # Create unified FastMCP application
    app = FastMCP(
        name="AgentKnowledgeMCP-Elasticsearch",
        version="2.0.0",
        instructions="Unified Elasticsearch tools for comprehensive knowledge management via modular server mounting"
    )

    # ================================
    # SERVER MOUNTING - MODULAR ARCHITECTURE
    # ================================
    print("πŸ—οΈ Mounting Elasticsearch sub-servers...")

    # Mount all sub-servers into unified interface
    app.mount(snapshots_app)       # 3 tools: snapshot management
    app.mount(index_metadata_app)  # 3 tools: metadata governance
    app.mount(document_app)        # 3 tools: document operations
    app.mount(index_app)           # 3 tools: index management
    app.mount(search_app)          # 2 tools: search & validation
    app.mount(batch_app)           # 2 tools: batch operations
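
A quick way to confirm the mounted sub-servers expose their tools is to list them through FastMCP's in-memory client; the snippet below is a sketch under that assumption, not code from the repository.

    # Sketch: list the tools exposed by the unified Elasticsearch server.
    import asyncio
    from fastmcp import Client

    async def show_mounted_tools() -> None:
        async with Client(app) as client:
            tools = await client.list_tools()
            for tool in tools:
                print(tool.name)  # expected to include create_index among the mounted tools

    asyncio.run(show_mounted_tools())
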
  • Mounts the unified elasticsearch_server_app (which includes create_index via sub-server mounting) into the main AgentKnowledgeMCP server.
    from src.elasticsearch.elasticsearch_server import app as elasticsearch_server_app
    from src.prompts.prompt_server import app as prompt_server_app

    # Import middleware
    from src.middleware.confirmation_middleware import ConfirmationMiddleware

    # Load configuration and initialize components
    CONFIG = load_config()
    init_security(CONFIG["security"]["allowed_base_directory"])

    # Initialize confirmation manager
    confirmation_manager = initialize_confirmation_manager(CONFIG)
    print(f"βœ… Confirmation system initialized (enabled: {CONFIG.get('confirmation', {}).get('enabled', True)})")

    # Auto-setup Elasticsearch if needed
    print("πŸ” Checking Elasticsearch configuration...")
    config_path = Path(__file__).parent / "config.json"
    setup_result = auto_setup_elasticsearch(config_path, CONFIG)
    if setup_result["status"] == "setup_completed":
        # Reload config after setup
        CONFIG = load_config()
        print("βœ… Elasticsearch auto-setup completed")
    elif setup_result["status"] == "already_configured":
        print("βœ… Elasticsearch already configured")
    elif setup_result["status"] == "setup_failed":
        print(f"⚠️ Elasticsearch auto-setup failed: {setup_result.get('error', 'Unknown error')}")
        print("πŸ“ You can manually setup using the 'setup_elasticsearch' tool")

    init_elasticsearch(CONFIG)

    # Create main FastMCP server
    app = FastMCP(
        name=CONFIG["server"]["name"],
        version=CONFIG["server"]["version"],
        instructions="πŸ—οΈ AgentKnowledgeMCP - Modern FastMCP server with modular composition architecture for knowledge management, Elasticsearch operations, file management, and system administration"
    )

    # ================================
    # MIDDLEWARE CONFIGURATION
    # ================================
    print("πŸ”’ Adding confirmation middleware...")

    # Add confirmation middleware to main server
    app.add_middleware(ConfirmationMiddleware())
    print("βœ… Confirmation middleware added successfully!")

    # ================================
    # SERVER COMPOSITION - MOUNTING
    # ================================
    print("πŸ—οΈ Mounting individual servers into main server...")

    # Mount Elasticsearch server with 'es' prefix
    # This provides: es_search, es_index_document, es_create_index, etc.
    app.mount(elasticsearch_server_app)
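
For context, a FastMCP server composed this way is typically started with the framework's run() entry point (stdio transport by default); the guard below is a sketch of that final step, not necessarily how this project launches the server.

    # Sketch of the usual FastMCP startup step once composition completes.
    if __name__ == "__main__":
        app.run()
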
  • Input schema definition for the create_index tool using Pydantic Field descriptions (inline in handler).
    index: Annotated[str, Field(description="Name of the new Elasticsearch index to create")],
    mapping: Annotated[Dict[str, Any], Field(description="Index mapping configuration defining field types and properties")],
    settings: Annotated[Optional[Dict[str, Any]], Field(description="Optional index settings for shards, replicas, analysis, etc.")] = None
    ) -> str:
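
FastMCP derives the tool's MCP input schema from these annotations via Pydantic. The dict below approximates the resulting schema; the exact output (for example, how the optional settings parameter is rendered) depends on the Pydantic and FastMCP versions in use.

    # Approximate shape of the generated input schema (illustrative only).
    approximate_input_schema = {
        "type": "object",
        "properties": {
            "index": {"type": "string", "description": "Name of the new Elasticsearch index to create"},
            "mapping": {"type": "object", "description": "Index mapping configuration defining field types and properties"},
            "settings": {"type": "object", "description": "Optional index settings for shards, replicas, analysis, etc."},
        },
        "required": ["index", "mapping"],
    }
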
  • Import of get_es_client helper used in create_index for Elasticsearch client access.
    from ..elasticsearch_client import get_es_client
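
The helper itself is not shown on this page. Below is a minimal hypothetical sketch of such a helper built on the official elasticsearch-py client; the hard-coded URL is an assumption, and the project's real get_es_client may cache the client and read connection details from its configuration.

    # Hypothetical sketch of an Elasticsearch client helper (not the project's code).
    from elasticsearch import Elasticsearch

    def get_es_client() -> Elasticsearch:
        # "http://localhost:9200" is an assumed default, not a documented value.
        return Elasticsearch("http://localhost:9200")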


MCP directory API

We provide all the information about MCP servers via our MCP API.

curl -X GET 'https://glama.ai/api/mcp/v1/servers/itshare4u/AgentKnowledgeMCP'

If you have feedback or need assistance with the MCP directory API, please join our Discord server.