AutoDocs MCP Server

Server Configuration

Describes the environment variables used to configure the server. All are optional; unset variables fall back to the defaults below.

Name                      Required  Description                       Default
AUTODOCS_CACHE_DIR        No        Cache directory location          ~/.autodocs/cache
AUTODOCS_LOG_LEVEL        No        Logging level                     INFO
AUTODOCS_MAX_CONCURRENT   No        Maximum concurrent PyPI requests  10
AUTODOCS_REQUEST_TIMEOUT  No        Request timeout in seconds        30
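As a minimal sketch of how these settings resolve, the snippet below reads each variable with the default from the table. The `load_settings` helper name and the returned dictionary shape are illustrative assumptions, not part of the server's actual API.

```python
import os

def load_settings(env=os.environ):
    # Defaults mirror the table above; names and types are assumptions.
    return {
        "cache_dir": env.get("AUTODOCS_CACHE_DIR",
                             os.path.expanduser("~/.autodocs/cache")),
        "log_level": env.get("AUTODOCS_LOG_LEVEL", "INFO"),
        "max_concurrent": int(env.get("AUTODOCS_MAX_CONCURRENT", "10")),
        "request_timeout": float(env.get("AUTODOCS_REQUEST_TIMEOUT", "30")),
    }

# With an empty environment, every default applies.
settings = load_settings(env={})
```

Passing `env={}` makes the fallback behavior easy to verify in isolation, since the real `os.environ` is never consulted.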

Schema

Prompts

Interactive templates invoked by user choice

No prompts.

Resources

Contextual data attached and managed by the client

No resources.

Tools

Functions exposed to the LLM to take actions

scan_dependencies

Scan project dependencies from pyproject.toml

Args:
    project_path: Path to project directory (defaults to current directory)

Returns: JSON with dependency specifications and project metadata
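MCP clients invoke tools such as `scan_dependencies` through a JSON-RPC 2.0 `tools/call` request. The sketch below builds such a payload; the `id` and the `"."` project path are placeholder values, and the exact transport (stdio or HTTP) depends on how the server is wired up.

```python
import json

# Illustrative MCP "tools/call" request for scan_dependencies.
# The id and project_path values are placeholders.
request = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "tools/call",
    "params": {
        "name": "scan_dependencies",
        "arguments": {"project_path": "."},
    },
}

payload = json.dumps(request)
```

The `arguments` object maps directly onto the tool's documented parameters, so adding or omitting `project_path` follows the same pattern for the other tools below.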

get_package_docs

Retrieve formatted documentation for a package with version-based caching.

Args:
    package_name: Name of the package to fetch documentation for
    version_constraint: Version constraint from dependency scanning
    query: Optional query to filter documentation sections

Returns: Formatted documentation with package metadata
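"Version-based caching" implies the cache is keyed by the resolved package version, so a new release invalidates stale entries automatically. The key format below is purely an assumption for illustration; the server's actual scheme is not documented here.

```python
def cache_key(package_name: str, resolved_version: str) -> str:
    # Hypothetical key scheme: one cache entry per (package, version) pair.
    # A new resolved version yields a new key, so old docs are never reused.
    return f"{package_name}-{resolved_version}.json"

key = cache_key("requests", "2.31.0")
```

Keying on the resolved version rather than the raw constraint (e.g. `>=2.28`) is what lets two projects with different constraints share a cache entry when they resolve to the same release.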

refresh_cache

Refresh the local documentation cache.

Returns: Statistics about cache refresh operation

get_cache_stats

Get statistics about the documentation cache.

Returns: Cache statistics and information

MCP directory API

All information about MCP servers is available through our MCP directory API.

curl -X GET 'https://glama.ai/api/mcp/v1/servers/bradleyfay/autodoc-mcp'
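The same request can be made from Python. The URL path shape is taken from the curl example above; the `server_url` helper is illustrative, and the commented-out fetch requires network access.

```python
import urllib.request

BASE = "https://glama.ai/api/mcp/v1/servers"

def server_url(owner: str, server: str) -> str:
    # Mirrors the path in the curl example: /servers/<owner>/<server>.
    return f"{BASE}/{owner}/{server}"

url = server_url("bradleyfay", "autodoc-mcp")

# To actually fetch the server record (network required):
# with urllib.request.urlopen(url) as resp:
#     data = resp.read()
```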

If you have feedback or need assistance with the MCP directory API, please join our Discord server.