### Where do I get LLM API access?
```mermaid {scale: 0.8}
flowchart TD
A[Do you have a GPU?] -->|Yes| B[Local LLM]
A -->|No| E[Do you have Azure OpenAI/AI Foundry?]
E -->|Yes| F[Azure OpenAI/AI Foundry]
E -->|No| G[Do you have GitHub Copilot?]
G -->|Yes| I([GitHub Copilot Chat Models in VS Code])
G -->|Yes| H([GitHub Models])
G -->|No| K[😢]
```
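The decision tree above can be sketched in code as a small helper that maps each branch to an OpenAI-compatible base URL. The endpoint values below are illustrative assumptions (an Ollama default port for the local case, a placeholder Azure resource name, and the GitHub Models inference endpoint); check each provider's documentation before relying on them.

```python
import os

def choose_endpoint(has_gpu: bool, has_azure: bool, has_copilot: bool):
    """Follow the flowchart: return a (provider, base_url) pair."""
    if has_gpu:
        # Local LLM, e.g. an Ollama server (assumed default port).
        return ("local", "http://localhost:11434/v1")
    if has_azure:
        # Azure OpenAI / AI Foundry resource (placeholder resource name).
        resource = os.environ.get("AZURE_OPENAI_RESOURCE", "my-resource")
        return ("azure", f"https://{resource}.openai.azure.com")
    if has_copilot:
        # GitHub Models endpoint (assumed; also reachable via Copilot Chat in VS Code).
        return ("github", "https://models.inference.ai.azure.com")
    # No GPU, no Azure, no Copilot: 😢
    return ("none", None)

print(choose_endpoint(False, False, True))
```

The boolean checks mirror the order of the questions in the diagram, so the first matching branch wins, just as the flowchart reads top to bottom.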