
Stata-MCP

example.toml
[stata]
stata_cli = "/usr/local/bin/stata-mp"

[stata-mcp]
output_base_path = "None"

[llm]
LLM_TYPE = "ollama"  # Options: "ollama", "openai"

[llm.ollama]
MODEL = "qwen2.5-coder:7b"
BASE_URL = "http://localhost:11434"

[llm.openai]
MODEL = "gpt-3.5-turbo"
BASE_URL = "https://api.openai.com/v1"
API_KEY = "<YOUR_OPENAI_API_KEY>"
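The LLM_TYPE key selects which backend table ([llm.ollama] or [llm.openai]) applies. Below is a minimal sketch of how such a config resolves, using Python 3.11+ tomllib; it is an illustration only, not necessarily how Stata-MCP itself loads the file, and the file path "example.toml" is assumed.

```python
import tomllib

# Load the TOML config (tomllib requires binary mode).
with open("example.toml", "rb") as f:
    config = tomllib.load(f)

stata_cli = config["stata"]["stata_cli"]

# LLM_TYPE picks the matching backend section, e.g. "ollama" -> [llm.ollama].
llm_type = config["llm"]["LLM_TYPE"]
llm = config["llm"][llm_type]

print(f"Stata CLI: {stata_cli}")
print(f"LLM backend: {llm_type}, model: {llm['MODEL']}, base URL: {llm['BASE_URL']}")
```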

MCP directory API

We provide all the information about MCP servers via our MCP directory API.

curl -X GET 'https://glama.ai/api/mcp/v1/servers/SepineTam/stata-mcp'
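The same endpoint can be queried from Python. This is a minimal sketch using only the standard library; it assumes the endpoint returns JSON, which is not confirmed here.

```python
import json
import urllib.request

URL = "https://glama.ai/api/mcp/v1/servers/SepineTam/stata-mcp"

# Fetch the server entry and pretty-print it (assuming a JSON response body).
with urllib.request.urlopen(URL) as resp:
    data = json.load(resp)

print(json.dumps(data, indent=2))
```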

If you have feedback or need assistance with the MCP directory API, please join our Discord server.