Noctua MCP Server

Official server by geneontology
action.yml (5.34 kB)
# Composite GitHub Action that installs Claude Code (plus optional MCP servers) and runs it against a prompt.
name: "Claude Code Action"
description: "Run Claude Code in GitHub Actions workflows"

inputs:
  github_token:
    description: "GitHub token with repo and issues permissions"
    required: true
  anthropic_api_key:
    description: "Anthropic API key"
    required: true
  cborg_api_key:
    description: "CBORG API key"
    required: true
  prompt:
    description: "The prompt to send to Claude Code"
    required: false
    default: ""
  prompt_file:
    description: "Path to a file containing the prompt to send to Claude Code"
    required: false
    default: ""
  allowed_tools:
    description: "Comma-separated list of allowed tools for Claude Code to use"
    required: false
    default: ""
  output_file:
    description: "File to save Claude Code output to (optional)"
    required: false
    default: ""
  timeout_minutes:
    description: "Timeout in minutes for Claude Code execution"
    required: false
    default: "10"
  install_github_mcp:
    description: "Whether to install the GitHub MCP server"
    required: false
    default: "false"
  install_artl_mcp:
    description: "Whether to install the ARTL MCP server"
    required: false
    default: "false"

runs:
  using: "composite"
  steps:
    - name: Install uvx
      shell: bash
      run: |
        curl -LsSf https://astral.sh/uv/install.sh | sh
        source $HOME/.cargo/env
        which uvx || echo "uvx not found in PATH, installing via pip"
        pip install uv || true

    - name: Install Claude Code
      shell: bash
      run: npm install -g @anthropic-ai/claude-code

    - name: Install GitHub MCP Server
      if: inputs.install_github_mcp == 'true'
      shell: bash
      run: |
        claude mcp add-json github '{
          "command": "docker",
          "args": [
            "run", "-i", "--rm",
            "-e", "GITHUB_PERSONAL_ACCESS_TOKEN",
            "ghcr.io/github/github-mcp-server:sha-ff3036d"
          ],
          "env": {
            "GITHUB_PERSONAL_ACCESS_TOKEN": "${{ inputs.github_token }}"
          }
        }'

    - name: Install ARTL MCP Server
      if: inputs.install_artl_mcp == 'true'
      shell: bash
      run: |
        claude mcp add-json artl '{
          "command": "uvx",
          "args": ["artl-mcp"]
        }'

    - name: Prepare Prompt File
      shell: bash
      id: prepare_prompt
      run: |
        # Check that either prompt or prompt_file is provided
        if [ -z "${{ inputs.prompt }}" ] && [ -z "${{ inputs.prompt_file }}" ]; then
          echo "::error::Neither 'prompt' nor 'prompt_file' was provided. At least one is required."
          exit 1
        fi

        # Determine which prompt source to use
        if [ ! -z "${{ inputs.prompt_file }}" ]; then
          # Check that the prompt file exists
          if [ ! -f "${{ inputs.prompt_file }}" ]; then
            echo "::error::Prompt file '${{ inputs.prompt_file }}' does not exist."
            exit 1
          fi
          # Use the provided prompt file
          PROMPT_PATH="${{ inputs.prompt_file }}"
        else
          # Write the inline prompt to a temporary file
          mkdir -p /tmp/claude-action
          PROMPT_PATH="/tmp/claude-action/prompt.txt"
          echo "${{ inputs.prompt }}" > "$PROMPT_PATH"
        fi

        # Verify the prompt file is not empty
        if [ ! -s "$PROMPT_PATH" ]; then
          echo "::error::Prompt is empty. Please provide a non-empty prompt."
          exit 1
        fi

        # Save the prompt path for the next step
        echo "PROMPT_PATH=$PROMPT_PATH" >> $GITHUB_ENV

    - name: Run Claude Code
      shell: bash
      id: run_claude
      run: |
        ALLOWED_TOOLS_ARG=""
        if [ ! -z "${{ inputs.allowed_tools }}" ]; then
          ALLOWED_TOOLS_ARG="--allowedTools ${{ inputs.allowed_tools }}"
        fi

        # Set a timeout to ensure the command doesn't run indefinitely
        timeout_seconds=$((${{ inputs.timeout_minutes }} * 60))

        if [ -z "${{ inputs.output_file }}" ]; then
          # Run Claude Code and output to console
          timeout $timeout_seconds claude \
            -p \
            --verbose \
            --output-format stream-json \
            "$(cat ${{ env.PROMPT_PATH }})" \
            ${{ inputs.allowed_tools != '' && format('--allowedTools "{0}"', inputs.allowed_tools) || '' }}
        else
          # Run Claude Code and tee output to console and file
          timeout $timeout_seconds claude \
            -p \
            --verbose \
            --output-format stream-json \
            "$(cat ${{ env.PROMPT_PATH }})" \
            ${{ inputs.allowed_tools != '' && format('--allowedTools "{0}"', inputs.allowed_tools) || '' }} | tee output.txt

          # Collect the streamed JSON lines into a single JSON array
          jq -s '.' output.txt > output.json

          # Extract the result from the last item in the array (system message)
          jq -r '.[-1].result' output.json > "${{ inputs.output_file }}"
          echo "Complete output saved to output.json, final response saved to ${{ inputs.output_file }}"
        fi
      env:
        # Placeholder value; authentication is routed through ANTHROPIC_AUTH_TOKEN and the CBORG proxy below
        ANTHROPIC_API_KEY: "."
        ANTHROPIC_AUTH_TOKEN: ${{ inputs.cborg_api_key }}
        GITHUB_TOKEN: ${{ inputs.github_token }}
        ANTHROPIC_BASE_URL: "https://api.cborg.lbl.gov"
        DISABLE_NON_ESSENTIAL_MODEL_CALLS: "1"
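As a usage sketch only, a consuming repository could invoke this composite action roughly as follows. The checkout path ./.github/actions/claude-code, the workflow_dispatch trigger, the secret names, and the example input values are assumptions for illustration, not part of action.yml.

# Hypothetical workflow calling the composite action above.
# Paths, trigger, secrets, and input values are illustrative assumptions.
name: Claude Code demo

on:
  workflow_dispatch:

jobs:
  claude:
    runs-on: ubuntu-latest
    permissions:
      contents: read
      issues: write
    steps:
      - uses: actions/checkout@v4

      - name: Run Claude Code
        uses: ./.github/actions/claude-code   # assumed location of action.yml
        with:
          github_token: ${{ secrets.GITHUB_TOKEN }}
          anthropic_api_key: ${{ secrets.ANTHROPIC_API_KEY }}
          cborg_api_key: ${{ secrets.CBORG_API_KEY }}
          prompt: "Summarize the open issues in this repository."
          allowed_tools: "Bash,Read"        # example value; format depends on Claude Code's tool names
          output_file: claude-output.md
          timeout_minutes: "15"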

MCP directory API

We provide all the information about MCP servers via our MCP API.

curl -X GET 'https://glama.ai/api/mcp/v1/servers/geneontology/noctua-mcp'

If you have feedback or need assistance with the MCP directory API, please join our Discord server.