# ask_model
Ask questions to Ollama AI models through the MCP server to get responses and insights.
## Instructions

Ask a question to a specific Ollama model.

Args:

- model: Name of the model to use (e.g., 'llama2')
- question: The question to ask the model
## Input Schema
| Name | Required | Description | Default |
|---|---|---|---|
| model | Yes | Name of the model to use (e.g., 'llama2') | |
| question | Yes | The question to ask the model | |
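For reference, a client-side call might look like the following sketch, which uses the stdio client from the official MCP Python SDK. The launch command (`mcp-ollama`) and the example question are assumptions, not part of this server's documented interface; adjust them to however you run the server.

```python
import asyncio

from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client


async def main() -> None:
    # Assumed launch command for the server; replace with your actual one.
    params = StdioServerParameters(command="mcp-ollama")
    async with stdio_client(params) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()
            # Arguments must match the input schema above; both fields are required.
            result = await session.call_tool(
                "ask_model",
                {"model": "llama2", "question": "Why is the sky blue?"},
            )
            print(result.content)


asyncio.run(main())
```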
## Implementation Reference
- src/mcp_ollama/server.py:73-91 (handler): The main handler for the `ask_model` tool. It is decorated with `@mcp.tool()`, which registers it with the MCP server. The function takes a model name and a question, sends a chat request to Ollama, and returns the response content. The input schema is derived from the type hints and docstring.

  ```python
  @mcp.tool()
  async def ask_model(model: str, question: str) -> str:
      """Ask a question to a specific Ollama model

      Args:
          model: Name of the model to use (e.g., 'llama2')
          question: The question to ask the model
      """
      try:
          response = ollama.chat(
              model=model,
              messages=[{
                  'role': 'user',
                  'content': question
              }]
          )
          return response['message']['content']
      except Exception as e:
          return f"Error querying model: {str(e)}"
  ```
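  Because the handler is a plain async function, it can also be exercised directly, without an MCP client. A quick sketch, assuming the decorator leaves the function callable (the official SDK's FastMCP returns the original function), that a local Ollama server is running, and that the model has been pulled:

  ```python
  import asyncio

  # Direct call to the handler, bypassing MCP entirely; assumes Ollama is
  # running locally and 'llama2' has been pulled (`ollama pull llama2`).
  answer = asyncio.run(ask_model("llama2", "In one sentence, what is MCP?"))
  print(answer)
  ```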
- src/mcp_ollama/server.py:73 (registration): The `@mcp.tool()` decorator registers the `ask_model` function as an MCP tool.

  ```python
  @mcp.tool()
  ```
- src/mcp_ollama/server.py:74-80 (schema): The function signature and docstring define the input schema (`model: str`, `question: str`) and the output type (`str`).

  ```python
  async def ask_model(model: str, question: str) -> str:
      """Ask a question to a specific Ollama model

      Args:
          model: Name of the model to use (e.g., 'llama2')
          question: The question to ask the model
      """
  ```
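  For comparison with the table above, the JSON Schema that FastMCP derives from this signature looks roughly like the dict below. This is a sketch of the typical output, not verbatim from the server; exact titles and metadata may differ.

  ```python
  # Approximate input schema generated from the type hints; both
  # parameters are plain strings with no defaults, so both are required.
  ask_model_input_schema = {
      "type": "object",
      "properties": {
          "model": {"type": "string"},
          "question": {"type": "string"},
      },
      "required": ["model", "question"],
  }
  ```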