
mcp-run-python

Official
by pydantic
qwen.py (749 B)
from __future__ import annotations as _annotations

from ..profiles.openai import OpenAIModelProfile
from . import InlineDefsJsonSchemaTransformer, ModelProfile


def qwen_model_profile(model_name: str) -> ModelProfile | None:
    """Get the model profile for a Qwen model."""
    if model_name.startswith('qwen-3-coder'):
        return OpenAIModelProfile(
            json_schema_transformer=InlineDefsJsonSchemaTransformer,
            openai_supports_tool_choice_required=False,
            openai_supports_strict_tool_definition=False,
            ignore_streamed_leading_whitespace=True,
        )
    return ModelProfile(
        json_schema_transformer=InlineDefsJsonSchemaTransformer,
        ignore_streamed_leading_whitespace=True,
    )
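For context, here is a minimal sketch of how this selector behaves when called directly. The import path pydantic_ai.profiles.qwen is an assumption inferred from the relative imports above, and the example model names are illustrative only.

    # Sketch only: assumes the module lives at pydantic_ai/profiles/qwen.py,
    # as suggested by "from ..profiles.openai import ..." and "from . import ...".
    from pydantic_ai.profiles.qwen import qwen_model_profile

    # 'qwen-3-coder*' models get an OpenAI-flavoured profile with required
    # tool choice and strict tool definitions disabled; any other Qwen model
    # falls through to the generic profile.
    coder_profile = qwen_model_profile('qwen-3-coder-480b')      # hypothetical model name
    generic_profile = qwen_model_profile('qwen-2.5-72b-instruct')  # hypothetical model name

    print(type(coder_profile).__name__)    # OpenAIModelProfile
    print(type(generic_profile).__name__)  # ModelProfile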

MCP directory API

We provide all the information about MCP servers via our MCP API.

curl -X GET 'https://glama.ai/api/mcp/v1/servers/pydantic/pydantic-ai'
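The same request can be made from Python with only the standard library; this is a minimal sketch, and the structure of the JSON payload is whatever the MCP directory API returns for the server.

    # Minimal Python equivalent of the curl call above (standard library only).
    import json
    import urllib.request

    url = 'https://glama.ai/api/mcp/v1/servers/pydantic/pydantic-ai'
    with urllib.request.urlopen(url) as resp:
        server_info = json.load(resp)  # response schema not specified here

    print(json.dumps(server_info, indent=2))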

If you have feedback or need assistance with the MCP directory API, please join our Discord server.