# Components of System Stack 2.0
- LLM inference engine: OpenAI, vLLM, etc.
- AI Controller
  - Controls LLM generation one token at a time
  - Enables constraints such as "output valid JSON" or "return a substring of the input"
- AI runtime / language
  - Individual prompt: Guidance (Microsoft)
  - Coordinated services: GenAIScript
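The AI Controller's role above can be sketched as a token-by-token generation loop in which the controller masks out candidate tokens that would violate a constraint. This is a minimal illustrative sketch, not the API of Guidance or any real inference engine: `candidates_fn` is a hypothetical stand-in for the LLM's next-token proposals, and `allowed_fn` is the controller's constraint on partial outputs.

```python
from typing import Callable, List

def constrained_generate(
    candidates_fn: Callable[[str], List[str]],  # hypothetical stand-in for the LLM's next-token proposals
    allowed_fn: Callable[[str], bool],          # the controller's constraint on partial outputs
    max_tokens: int = 16,
) -> str:
    out = ""
    for _ in range(max_tokens):
        # Controller step: keep only tokens whose extension still satisfies the constraint.
        legal = [t for t in candidates_fn(out) if allowed_fn(out + t)]
        if not legal:
            break  # no legal continuation remains; stop generation
        # A real controller would sample among legal tokens by renormalized model
        # probability; here we just take the first legal candidate.
        out += legal[0]
    return out

# Example of the "return a substring of the input" constraint from the list above.
prompt = "the quick brown fox"
result = constrained_generate(
    candidates_fn=lambda prefix: ["quick", " brown", "xyz", " fox"],
    allowed_fn=lambda s: s in prompt,
)
print(result)  # "quick brown fox"
```

The same loop structure supports the JSON-format constraint: `allowed_fn` would instead check that the partial output remains a valid prefix of a JSON document, which is how grammar-constrained decoders operate in practice.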