
Server Configuration

Describes the environment variables used to configure the server. All variables are optional, but the server needs either a token/project pair or a projects file to do useful work.

| Name | Required | Description | Default |
|------|----------|-------------|---------|
| `OVERLEAF_GIT_TOKEN` | No | Your Overleaf Git token (starts with `olp_`) | |
| `OVERLEAF_PROJECT_ID` | No | The project ID from the Overleaf URL (`https://www.overleaf.com/project/[PROJECT_ID]`) | |
| `OVERLEAF_PROJECT_NAME` | No | Optional display name for the project | |
| `OVERLEAF_GIT_TOKEN_FILE` | No | Path to a file containing the Overleaf Git token (alternative to `OVERLEAF_GIT_TOKEN`) | |
| `OVERLEAF_PROJECTS_CONFIG` | No | Path to a `projects.json` file for multiple projects | |
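For a single-project setup, the token and project ID can be supplied directly as environment variables, or the token can be kept out of the environment by storing it in a file and pointing `OVERLEAF_GIT_TOKEN_FILE` at it. A minimal sketch; all values below are placeholders, not real credentials:

```shell
# Single-project setup via environment variables.
# Placeholder values: substitute your own olp_ token and project ID.
export OVERLEAF_PROJECT_ID="0123456789abcdef01234567"
export OVERLEAF_PROJECT_NAME="My Paper"

# Alternative to OVERLEAF_GIT_TOKEN: store the token in a file and
# reference it via OVERLEAF_GIT_TOKEN_FILE.
printf 'olp_placeholder_token' > /tmp/overleaf_token
export OVERLEAF_GIT_TOKEN_FILE=/tmp/overleaf_token
cat "$OVERLEAF_GIT_TOKEN_FILE"
```

Using a token file keeps the secret out of shell history and process environments, which is usually preferable on shared machines.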

Capabilities

Server capabilities have not been inspected yet.

Tools

Functions exposed to the LLM to take actions

| Name | Description |
|------|-------------|
| `list_files` | List all files in an Overleaf project |
| `read_file` | Read a file from an Overleaf project |
| `get_sections` | Get all sections from a LaTeX file |
| `get_section_content` | Get content of a specific section |
| `status_summary` | Get a summary of the project status using default credentials |
| `list_projects` | List all available projects |
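MCP clients invoke these tools with a standard JSON-RPC `tools/call` request. A sketch of a `read_file` call follows; the `file_path` argument name is an assumption, so consult the schema the server returns from `tools/list` for the actual parameter names:

```shell
# Build a JSON-RPC tools/call request for the read_file tool.
# NOTE: "file_path" is a hypothetical argument name; check the
# server's tool schema (via tools/list) for the real one.
request=$(cat <<'EOF'
{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "tools/call",
  "params": {
    "name": "read_file",
    "arguments": { "file_path": "main.tex" }
  }
}
EOF
)
echo "$request"
```

The request body would typically be written to the server's stdin (or sent over whatever transport the client uses), not executed directly.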

Prompts

Interactive templates invoked by user choice

No prompts

Resources

Contextual data attached and managed by the client

No resources


MCP directory API

We provide all the information about MCP servers via our MCP API.

```shell
curl -X GET 'https://glama.ai/api/mcp/v1/servers/mjyoo2/OverleafMCP'
```

If you have feedback or need assistance with the MCP directory API, please join our Discord server.