
Azure AI Search MCP Server

by mm-repos

Server Configuration

Environment variables used to configure the server. Only the Azure AI Search settings are required; the Gemini and LangChain variables are optional.

| Name | Required | Description | Default |
| --- | --- | --- | --- |
| GEMINI_MODEL | No | The Gemini model to use | gemini-1.5-flash |
| GOOGLE_API_KEY | No | Your Google API key for Gemini (optional, for enhanced AI processing) | |
| LANGCHAIN_API_KEY | No | Your LangSmith API key (optional, for tracing and debugging) | |
| LANGCHAIN_PROJECT | No | The LangChain project name | azure-search-mcp |
| GEMINI_TEMPERATURE | No | The temperature setting for the Gemini model | 0.1 |
| LANGCHAIN_ENDPOINT | No | The LangChain API endpoint | https://api.smith.langchain.com |
| AZURE_SEARCH_API_KEY | Yes | Your Azure AI Search admin API key | |
| LANGCHAIN_TRACING_V2 | No | Enable LangChain tracing v2 for debugging | true |
| AZURE_SEARCH_ENDPOINT | Yes | The endpoint URL of your Azure AI Search service (e.g., https://your-search-service.search.windows.net) | |
| AZURE_SEARCH_INDEX_NAME | Yes | The name of your Azure AI Search index | |
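
For reference, a minimal sketch of exporting these variables in a shell before launching the server. All values below are placeholders; the Gemini and LangSmith blocks are optional and can be omitted.

```bash
# Required Azure AI Search settings (placeholder values)
export AZURE_SEARCH_ENDPOINT="https://your-search-service.search.windows.net"
export AZURE_SEARCH_API_KEY="<your-admin-api-key>"
export AZURE_SEARCH_INDEX_NAME="<your-index-name>"

# Optional: Gemini settings for enhanced AI processing
export GOOGLE_API_KEY="<your-google-api-key>"
export GEMINI_MODEL="gemini-1.5-flash"
export GEMINI_TEMPERATURE="0.1"

# Optional: LangSmith tracing and debugging
export LANGCHAIN_TRACING_V2="true"
export LANGCHAIN_API_KEY="<your-langsmith-api-key>"
export LANGCHAIN_PROJECT="azure-search-mcp"
export LANGCHAIN_ENDPOINT="https://api.smith.langchain.com"
```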

Tools

Functions exposed to the LLM to take actions

No tools

Prompts

Interactive templates invoked by user choice

No prompts

Resources

Contextual data attached and managed by the client

No resources

MCP directory API

We provide all the information about MCP servers via our MCP API.

curl -X GET 'https://glama.ai/api/mcp/v1/servers/mm-repos/langgraph-claude-azure-mcp'
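
Assuming the endpoint returns JSON (not confirmed here), the response can be pretty-printed with jq:

```bash
# Fetch this server's directory entry and pretty-print it (assumes a JSON response)
curl -s -X GET 'https://glama.ai/api/mcp/v1/servers/mm-repos/langgraph-claude-azure-mcp' | jq '.'
```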

If you have feedback or need assistance with the MCP directory API, please join our Discord server.