Server Configuration

Describes the environment variables required to run the server.

No environment variables are required.
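Since the server takes no environment variables, an MCP client configuration only needs the command that launches it. A minimal sketch for a client that uses the common `mcpServers` JSON convention (the `npx` launch command and the `octocode-mcp` package name are assumed from the repository name and may differ):

```json
{
  "mcpServers": {
    "octocode": {
      "command": "npx",
      "args": ["-y", "octocode-mcp"]
    }
  }
}
```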

Tools

Functions exposed to the LLM so it can take actions

githubSearchCode: Search file content or files by path
githubGetFileContent: Read file content
githubViewRepoStructure: Display directory structure
githubSearchRepositories: Search repositories by keywords/topics
githubSearchPullRequests: Search or fetch Pull Requests (metadata, diffs, discussions)
packageSearch: Find NPM/Python packages & their repository URLs

Prompts

Interactive templates invoked by user choice

generate-project: Interactive agent for scaffolding and generating new projects with architectural guidance and best practices, leveraging Octocode for research and feature implementation
help: Complete guide to all available Octocode prompts and tools for users
init: Initialize Octocode context by gathering user requirements and preferences
plan: Research-backed planning for bug fixes, features, or large refactors
research: Investigate anything using Octocode research tools
research_local: Investigate anything using Octocode research tools on local files (grep, ls, read, metadata)
review_pull_request: Comprehensive Pull Request review using Octocode tools with a Defects-First & Simplicity mental model

Resources

Contextual data attached and managed by the client

No resources

MCP directory API

We provide all the information about MCP servers via our MCP directory API.

curl -X GET 'https://glama.ai/api/mcp/v1/servers/bgauryy/octocode-mcp'
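The same endpoint can also be called from code. A minimal Python sketch that builds the request URL for any server slug (the `server_info_url` helper is illustrative, not part of the API):

```python
from urllib.parse import quote

API_BASE = "https://glama.ai/api/mcp/v1/servers"

def server_info_url(owner: str, name: str) -> str:
    # Build the MCP directory API URL for a given "owner/name" server slug,
    # percent-encoding each path segment.
    return f"{API_BASE}/{quote(owner)}/{quote(name)}"

print(server_info_url("bgauryy", "octocode-mcp"))
# https://glama.ai/api/mcp/v1/servers/bgauryy/octocode-mcp
```

The resulting URL can then be fetched with any HTTP client, as in the curl command above.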

If you have feedback or need assistance with the MCP directory API, please join our Discord server.