
ask_model

Submits a question to a specific Ollama model and returns its answer. Exposed as an MCP tool, so MCP clients can query locally hosted models directly.

Instructions

Ask a question to a specific Ollama model

Args:
    model: Name of the model to use (e.g., 'llama2')
    question: The question to ask the model

Input Schema

Name      Required  Description                                Default
model     Yes       Name of the model to use (e.g., 'llama2')  (none)
question  Yes       The question to ask the model              (none)
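
For reference, the sketch below approximates the JSON Schema that FastMCP derives from the handler's type hints (model: str, question: str). The exact keys and titles depend on the FastMCP version in use, so treat this as a hedged approximation rather than the canonical schema:

    # Hedged approximation of the generated input schema; exact field
    # names and metadata vary by FastMCP version.
    ASK_MODEL_INPUT_SCHEMA = {
        "type": "object",
        "properties": {
            "model": {"type": "string"},
            "question": {"type": "string"},
        },
        "required": ["model", "question"],
    }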

Implementation Reference

  • The core handler function for the ask_model tool. It is registered via the @mcp.tool() decorator and queries the specified Ollama model with ollama.chat(), returning the model's response or an error message. A client-side usage sketch follows this list.
    @mcp.tool()
    async def ask_model(model: str, question: str) -> str:
        """Ask a question to a specific Ollama model

        Args:
            model: Name of the model to use (e.g., 'llama2')
            question: The question to ask the model
        """
        try:
            response = ollama.chat(
                model=model,
                messages=[{
                    'role': 'user',
                    'content': question
                }]
            )
            return response['message']['content']
        except Exception as e:
            return f"Error querying model: {str(e)}"
  • Input schema defined by the type hints (model: str, question: str) and a docstring describing the parameters.
    async def ask_model(model: str, question: str) -> str:
        """Ask a question to a specific Ollama model

        Args:
            model: Name of the model to use (e.g., 'llama2')
            question: The question to ask the model
        """
  • Registers the ask_model tool with the MCP server using the FastMCP decorator.
    @mcp.tool()
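
To illustrate how an MCP client would invoke this tool, here is a minimal sketch using the official MCP Python SDK. The server launch command ("uv run mcp-ollama") is an assumption about how the server is started, not something the source confirms; substitute whatever command your MCP client configuration actually uses:

    # Hedged sketch: calls the ask_model tool over stdio via the MCP Python SDK.
    # The server launch command below is an assumption, not taken from the source.
    import asyncio

    from mcp import ClientSession, StdioServerParameters
    from mcp.client.stdio import stdio_client

    async def main() -> None:
        params = StdioServerParameters(command="uv", args=["run", "mcp-ollama"])
        async with stdio_client(params) as (read, write):
            async with ClientSession(read, write) as session:
                await session.initialize()
                result = await session.call_tool(
                    "ask_model",
                    {"model": "llama2", "question": "What is the Model Context Protocol?"},
                )
                # The tool returns a string, surfaced as text content.
                print(result.content[0].text)

    asyncio.run(main())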


MCP directory API

We provide all the information about MCP servers via our MCP API.

curl -X GET 'https://glama.ai/api/mcp/v1/servers/emgeee/mcp-ollama'
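
A minimal Python equivalent of the curl call above, using only the standard library:

    # Fetches the directory entry for the emgeee/mcp-ollama server.
    import json
    import urllib.request

    url = "https://glama.ai/api/mcp/v1/servers/emgeee/mcp-ollama"
    with urllib.request.urlopen(url) as resp:
        print(json.dumps(json.load(resp), indent=2))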

If you have feedback or need assistance with the MCP directory API, please join our Discord server.