# ask_model

Ask a question to a specific Ollama model and return its answer. Exposed as an MCP tool, so any MCP client can query a locally available model directly.
## Instructions

Ask a question to a specific Ollama model.

Args:
- `model`: Name of the model to use (e.g., `'llama2'`)
- `question`: The question to ask the model
## Input Schema
| Name | Required | Description | Default |
|---|---|---|---|
| model | Yes | Name of the model to use (e.g., 'llama2') | |
| question | Yes | The question to ask the model | |
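For a concrete picture of the call shape, here is a minimal client-side sketch using the official `mcp` Python SDK over stdio. The launch command `python -m mcp_ollama.server` is an assumption based on the file layout referenced below; adjust it to however the server is started in your setup.

```python
import asyncio

from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client

async def main() -> None:
    # Launch command is an assumption; point it at your mcp-ollama server.
    params = StdioServerParameters(command="python", args=["-m", "mcp_ollama.server"])
    async with stdio_client(params) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()
            # Invoke the ask_model tool with its two required arguments.
            result = await session.call_tool(
                "ask_model",
                {"model": "llama2", "question": "Why is the sky blue?"},
            )
            print(result.content)

asyncio.run(main())
```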
## Implementation Reference
- **src/mcp_ollama/server.py:73-91** (handler): The core handler for the `ask_model` tool. It is registered via the `@mcp.tool()` decorator and queries the specified Ollama model with `ollama.chat()`, returning the model's response, or an error message if the call fails.

  ```python
  @mcp.tool()
  async def ask_model(model: str, question: str) -> str:
      """Ask a question to a specific Ollama model

      Args:
          model: Name of the model to use (e.g., 'llama2')
          question: The question to ask the model
      """
      try:
          response = ollama.chat(
              model=model,
              messages=[{
                  'role': 'user',
                  'content': question
              }]
          )
          return response['message']['content']
      except Exception as e:
          return f"Error querying model: {str(e)}"
  ```
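  Because errors are caught and returned as strings, a failed Ollama call still produces a normal text result for the client rather than a protocol-level error. For a quick local check you can call the coroutine directly, bypassing MCP entirely; this sketch assumes the module is importable as `mcp_ollama.server` (taken from the path above) and that FastMCP's decorator returns the original coroutine unchanged, which the current Python SDK does.

  ```python
  import asyncio

  # Module path taken from the implementation reference; adjust if your layout differs.
  from mcp_ollama.server import ask_model

  # 'llama2' is just an example; use any model pulled into your local Ollama.
  print(asyncio.run(ask_model("llama2", "Why is the sky blue?")))
  ```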
- **src/mcp_ollama/server.py:74-80** (schema): The input schema is defined by the type hints (`model: str`, `question: str`) and the docstring describing each parameter.

  ```python
  async def ask_model(model: str, question: str) -> str:
      """Ask a question to a specific Ollama model

      Args:
          model: Name of the model to use (e.g., 'llama2')
          question: The question to ask the model
      """
  ```
- **src/mcp_ollama/server.py:73** (registration): Registers the `ask_model` tool with the MCP server via the FastMCP decorator.

  ```python
  @mcp.tool()
  ```
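The decorator presupposes a FastMCP instance named `mcp` in the same module. The sketch below shows the minimal surrounding setup under that assumption; the server name and the `mcp.run()` entry point are illustrative, since only the decorator line is confirmed by the reference above.

```python
import ollama
from mcp.server.fastmcp import FastMCP

# Server name is an assumption; only the decorator line is shown in the source.
mcp = FastMCP("mcp-ollama")

@mcp.tool()
async def ask_model(model: str, question: str) -> str:
    """Ask a question to a specific Ollama model"""
    ...  # handler body as shown in the implementation reference

if __name__ == "__main__":
    mcp.run()  # FastMCP defaults to the stdio transport
```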