# MCP Task

## Server Configuration

Describes the environment variables required to run the server.
| Name | Required | Description | Default |
|---|---|---|---|
| No arguments | | | |
## Capabilities

Server capabilities have not been inspected yet.
## Tools

Functions exposed to the LLM to take actions.
| Name | Description |
|---|---|
| run_task | Start a complex AI task. Perform advanced reasoning and analysis with state-of-the-art LLMs. Start multiple tasks at once by using an array for model. Returns a task ID immediately (or batch ID for multiple models) to check status and retrieve results. |
| check_task_status | Check the status of a running task. Returns current status, progress, and partial results if available. |
| get_task_result | Get the final result of a completed task. |
| cancel_task | Cancel a pending or running task, or all tasks in a batch. |
| wait_for_task | Wait for a task or any task in a batch to complete, fail, or be cancelled. Only waits for tasks that complete AFTER this call is made; ignores tasks that were already completed. |
| list_tasks | List all tasks with their current status. |
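The typical lifecycle implied by the table (start a task, poll its status, fetch the result) can be sketched as follows. This is a minimal, runnable stand-in, not the server's implementation: a real client would invoke these tools over an MCP connection, and the argument names (`model`, `prompt`) and the in-memory task store are assumptions for illustration.

```python
import itertools

# In-memory stand-in for the server's task store, so the flow runs
# without a live MCP connection. Tool names mirror the table above.
_tasks = {}
_ids = itertools.count(1)

def run_task(model, prompt):
    # Returns a task ID immediately; here the "work" finishes instantly.
    task_id = f"task-{next(_ids)}"
    _tasks[task_id] = {"status": "completed", "result": f"[{model}] answer"}
    return task_id

def check_task_status(task_id):
    # Report the current status of a task.
    return _tasks[task_id]["status"]

def get_task_result(task_id):
    # Fetch the final result of a completed task.
    task = _tasks[task_id]
    if task["status"] != "completed":
        raise RuntimeError("task not finished")
    return task["result"]

# Typical flow: start, poll, fetch.
tid = run_task("gpt-4o", "Summarize the MCP spec")
if check_task_status(tid) == "completed":
    print(get_task_result(tid))  # prints "[gpt-4o] answer"
```

In a real session the poll step would loop (or use `wait_for_task`) until the status leaves `pending`/`running`.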
## Prompts

Interactive templates invoked by user choice.
| Name | Description |
|---|---|
| solve | Solve a complicated problem with multiple state-of-the-art LLMs |
| plan | Create a comprehensive plan using multiple state-of-the-art LLMs working in parallel |
| code | Generate or modify code using a state-of-the-art coding LLM |
## Resources

Contextual data attached and managed by the client.
| Name | Description |
|---|---|
| No resources | |
## MCP directory API

We provide all the information about MCP servers via our MCP API.

```shell
curl -X GET 'https://glama.ai/api/mcp/v1/servers/just-every/mcp-task'
```

If you have feedback or need assistance with the MCP directory API, please join our Discord server.