
MCP-Repo2LLM

Server Configuration

Describes the environment variables used to configure the server. Both are optional.

Name          Required  Description                                          Default
GITHUB_TOKEN  No        Your GitHub token for accessing GitHub repositories  (none)
GITLAB_TOKEN  No        Your GitLab token for accessing GitLab repositories  (none)
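
Tokens are typically needed for private repositories and higher API rate limits. A minimal sketch of supplying them from Python; the token values below are placeholders for illustration only:

import os

# Placeholder values; create real tokens in your GitHub or GitLab account
# settings. Set only the variables you need.
os.environ["GITHUB_TOKEN"] = "ghp_example_token"
os.environ["GITLAB_TOKEN"] = "glpat_example_token"

# Variables set here are inherited by any server process spawned from this
# Python process (see the client sketch under Tools below).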

Schema

Prompts

Interactive templates invoked by user choice

No prompts

Resources

Contextual data attached and managed by the client

No resources

Tools

Functions exposed to the LLM to take actions

Name             Description
get_gitlab_repo  Process and return the code from a GitLab repository branch as text
get_github_repo  Process and return the code from a GitHub repository branch as text
get_local_repo   Process and return the code from a local repository as text
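
A sketch of how a client could call these tools with the official MCP Python SDK. The launch command (uvx mcp-repo2llm) and the tool argument names (repo_url, branch) are assumptions not documented on this page; discover the real input schema via list_tools() or the project's README.

import asyncio
import os

from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client

async def main():
    # Assumed launch command; check the project's README for the real one.
    params = StdioServerParameters(
        command="uvx",
        args=["mcp-repo2llm"],
        # Pass the optional tokens through the environment.
        env={**os.environ, "GITHUB_TOKEN": "ghp_example_token"},
    )
    async with stdio_client(params) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()

            # List the tools and input schemas as the server declares them.
            tools = await session.list_tools()
            print([tool.name for tool in tools.tools])

            # Hypothetical argument names; use the schema printed above.
            result = await session.call_tool(
                "get_github_repo",
                arguments={
                    "repo_url": "https://github.com/crisschan/mcp-repo2llm",
                    "branch": "main",
                },
            )
            print(result.content)

asyncio.run(main())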

MCP directory API

We provide all the information about MCP servers via our MCP directory API.

curl -X GET 'https://glama.ai/api/mcp/v1/servers/crisschan/mcp-repo2llm'
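
The same query from Python using the requests library; the exact shape of the returned JSON is not documented on this page:

import requests

# Fetch this server's directory entry from the Glama MCP API.
resp = requests.get("https://glama.ai/api/mcp/v1/servers/crisschan/mcp-repo2llm")
resp.raise_for_status()
print(resp.json())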

If you have feedback or need assistance with the MCP directory API, please join our Discord server.