
ask_model

Ask questions to Ollama AI models through the MCP server to get responses and insights.

Instructions

Ask a question to a specific Ollama model

Args:
  model: Name of the model to use (e.g., 'llama2')
  question: The question to ask the model

Input Schema

| Name     | Required | Description                               | Default |
|----------|----------|-------------------------------------------|---------|
| model    | Yes      | Name of the model to use (e.g., 'llama2') |         |
| question | Yes      | The question to ask the model             |         |
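
To illustrate the schema in practice, here is a minimal Python sketch that calls ask_model through the official MCP Python SDK's stdio client. The launch command and the example question are assumptions; substitute whatever you use to start the mcp-ollama server.

    import asyncio

    from mcp import ClientSession, StdioServerParameters
    from mcp.client.stdio import stdio_client

    # Hypothetical launch command; adjust to how you run mcp-ollama.
    server_params = StdioServerParameters(command="uv", args=["run", "mcp-ollama"])

    async def main() -> None:
        async with stdio_client(server_params) as (read, write):
            async with ClientSession(read, write) as session:
                await session.initialize()
                # Both arguments are required, per the input schema above.
                result = await session.call_tool(
                    "ask_model",
                    arguments={"model": "llama2", "question": "Why is the sky blue?"},
                )
                print(result.content)

    asyncio.run(main())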

Implementation Reference

  • The main handler function for the 'ask_model' tool. It is decorated with @mcp.tool(), which registers it with the MCP server (a sketch of the surrounding server setup follows this list). The function takes a model name and a question, sends a chat request to Ollama, and returns the response content. The input schema is derived from its type hints and docstring.
    @mcp.tool()
    async def ask_model(model: str, question: str) -> str:
        """Ask a question to a specific Ollama model

        Args:
            model: Name of the model to use (e.g., 'llama2')
            question: The question to ask the model
        """
        try:
            response = ollama.chat(
                model=model,
                messages=[{
                    'role': 'user',
                    'content': question
                }]
            )
            return response['message']['content']
        except Exception as e:
            return f"Error querying model: {str(e)}"
  • The @mcp.tool() decorator registers the ask_model function as an MCP tool.
    @mcp.tool()
  • The function signature and docstring define the input schema (model: str, question: str) and output (str).
    async def ask_model(model: str, question: str) -> str:
        """Ask a question to a specific Ollama model

        Args:
            model: Name of the model to use (e.g., 'llama2')
            question: The question to ask the model
        """
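
For context, here is a minimal sketch of the server setup such a handler typically lives in, assuming the FastMCP helper from the official MCP Python SDK (which supplies the @mcp.tool() decorator); the server name "ollama" is a placeholder:

    import ollama
    from mcp.server.fastmcp import FastMCP

    # Placeholder server name; the actual name used by mcp-ollama may differ.
    mcp = FastMCP("ollama")

    @mcp.tool()
    async def ask_model(model: str, question: str) -> str:
        ...  # handler body as shown in the reference above

    if __name__ == "__main__":
        mcp.run()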

MCP directory API

We provide all the information about MCP servers via our MCP API.

curl -X GET 'https://glama.ai/api/mcp/v1/servers/emgeee/mcp-ollama'
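
The same request expressed as a short Python sketch, using the requests library:

    import requests

    # Fetch the directory entry for the emgeee/mcp-ollama server.
    resp = requests.get("https://glama.ai/api/mcp/v1/servers/emgeee/mcp-ollama")
    resp.raise_for_status()
    print(resp.json())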

If you have feedback or need assistance with the MCP directory API, please join our Discord server.