settings.local.json (1.61 kB)
{
  "permissions": {
    "allow": [
      "Bash(uv init --python 3.11)",
      "Bash(export PATH=\"$HOME/.local/bin:$PATH\")",
      "Bash(uv add --dev grpcio-tools)",
      "mcp__context7__get-library-docs",
      "Bash(uv add python-dotenv)",
      "Bash(uv run python test_basic.py)",
      "mcp__nix-mcp__nix_get_global_configs",
      "Bash(cat /Users/hai.qiu/workspace/nix-mcp/doc/nix.q.abi.json)",
      "Bash(uv pip show mcp)",
      "Bash(python -c \"import mcp; print(mcp.__version__)\")",
      "Bash(/Users/hai.qiu/workspace/nix-mcp/.venv/bin/python -c \"import mcp; print(mcp.__version__)\")",
      "Bash(/Users/hai.qiu/workspace/nix-mcp/.venv/bin/python -c \"import mcp; print(dir(mcp))\")",
      "Bash(/Users/hai.qiu/workspace/nix-mcp/.venv/bin/python -c \"from mcp import server; print(dir(server))\")",
      "Bash(/Users/hai.qiu/workspace/nix-mcp/.venv/bin/python -m nix_mcp.server_fastmcp)",
      "Bash(/Users/hai.qiu/workspace/nix-mcp/.venv/bin/python test_resolver_v2.py globalconfn)",
      "Bash(make run)",
      "Bash(make check)",
      "Read(//Users/hai.qiu/**)",
      "Bash(uv sync)",
      "Bash(uv run python test_server_standalone.py)",
      "mcp__nix-mcp__query",
      "mcp__nix-mcp__list_queries",
      "mcp__nix-mcp__get_query_abi",
      "Read(//Users/hai.qiu/.nix-mcp/logs/**)",
      "mcp__context7__resolve-library-id"
    ],
    "additionalDirectories": [
      "/Users/hai.qiu/workspace/b1x-exchange",
      "/Users/hai.qiu/workspace/b1x-cap/native-indexer-oracles/"
    ]
  },
  "enableAllProjectMcpServers": true,
  "enabledMcpjsonServers": ["nix-mcp"]
}
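Each entry in `permissions.allow` names a tool, optionally followed by a parenthesized argument pattern (e.g. `Bash(make run)`), while bare entries such as `mcp__nix-mcp__query` allow an MCP tool unconditionally. A minimal sketch of reading such a file and grouping the rules by tool — the grouping helper is illustrative, not part of any official API, and the inline sample is a shortened stand-in for the real file:

```python
import json

# Shortened stand-in for the settings.local.json shown above.
sample = json.loads("""
{
  "permissions": {
    "allow": ["Bash(make run)", "Bash(uv sync)", "mcp__nix-mcp__query"]
  },
  "enabledMcpjsonServers": ["nix-mcp"]
}
""")

def group_rules(settings: dict) -> dict:
    """Group allow-list entries by their tool prefix."""
    groups: dict[str, list[str]] = {}
    for rule in settings.get("permissions", {}).get("allow", []):
        # "Bash(make run)" -> "Bash"; bare entries keep their full name.
        tool = rule.split("(", 1)[0]
        groups.setdefault(tool, []).append(rule)
    return groups

print(group_rules(sample))
# {'Bash': ['Bash(make run)', 'Bash(uv sync)'], 'mcp__nix-mcp__query': ['mcp__nix-mcp__query']}
```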

MCP directory API

We provide all the information about MCP servers via our MCP API.

curl -X GET 'https://glama.ai/api/mcp/v1/servers/haiqiubullish/nix-mcp'
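The same endpoint can be called programmatically. A small Python sketch, assuming the endpoint returns a JSON object (the response schema is not documented here, so the `sample` payload below is hypothetical):

```python
import json
from urllib.request import Request, urlopen

API_URL = "https://glama.ai/api/mcp/v1/servers/haiqiubullish/nix-mcp"

def fetch_server_info(url: str = API_URL) -> dict:
    """Plain GET with a JSON Accept header, mirroring the curl example."""
    req = Request(url, headers={"Accept": "application/json"})
    with urlopen(req) as resp:
        return json.load(resp)

# Offline illustration with a hypothetical payload shape:
sample = json.loads('{"name": "nix-mcp"}')
print(sample["name"])  # -> nix-mcp
```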

If you have feedback or need assistance with the MCP directory API, please join our Discord server.