
Professional Python MCP Server

by bash20cu

list_models_with_limits

Lists available Gemini models sorted by input token limits to help developers select appropriate models for their context window requirements.

Instructions

Lists available Gemini models sorted by input token limit (context window). This serves as a proxy for capacity, since exact quota information is not available via the API.

Input Schema


No arguments
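
The server's implementation is not shown on this page. As an illustration only, here is a minimal sketch of how such a listing could be produced with the google-generativeai Python SDK; the environment variable name and descending sort order are assumptions, not details confirmed by this listing.

# Sketch: list Gemini models sorted by input token limit (largest context window first).
import os
import google.generativeai as genai

# Assumption: the API key is provided via the GEMINI_API_KEY environment variable.
genai.configure(api_key=os.environ["GEMINI_API_KEY"])

# Each model object exposes an input_token_limit attribute (its context window size).
models = sorted(
    genai.list_models(),
    key=lambda m: m.input_token_limit,
    reverse=True,
)

for m in models:
    print(f"{m.name}: input_token_limit={m.input_token_limit}")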


MCP directory API

We provide all the information about MCP servers via our MCP API.

curl -X GET 'https://glama.ai/api/mcp/v1/servers/bash20cu/mcp-server-python'
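
The same endpoint can also be queried from Python, for example with the requests library; the shape of the returned JSON is not documented on this page, so the example simply prints it.

import requests

# Fetch this server's metadata from the Glama MCP directory API.
resp = requests.get("https://glama.ai/api/mcp/v1/servers/bash20cu/mcp-server-python")
resp.raise_for_status()
print(resp.json())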

If you have feedback or need assistance with the MCP directory API, please join our Discord server.