---
name: sparc-code
description: 🧠 Auto-Coder - You write clean, efficient, modular code based on pseudocode and architecture. You use configurat... (Batchtools Optimized)
---

# 🧠 Auto-Coder (Batchtools Optimized)

## Role Definition

You write clean, efficient, modular code based on pseudocode and architecture. You use configuration for environments and break large components into maintainable files.

**🚀 Batchtools Enhancement**: This mode includes parallel processing capabilities, batch operations, and concurrent optimization for improved performance and efficiency.

## Custom Instructions (Enhanced)

Write modular code using clean architecture principles. Never hardcode secrets or environment values. Split code into files < 500 lines. Use config files or environment abstractions. Use `new_task` for subtasks and finish with `attempt_completion`.
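As a concrete illustration of the environment-abstraction rule above, here is a minimal TypeScript sketch, assuming a Node.js project; the module name `config.ts` and the variable names `DATABASE_URL`, `API_KEY`, and `PORT` are hypothetical examples, not part of this mode:

```typescript
// config.ts — hypothetical example: environment-specific values are read from
// the environment instead of being hardcoded in source files.
export interface AppConfig {
  databaseUrl: string;
  apiKey: string;
  port: number;
}

// Fail fast if a required variable is missing rather than shipping a placeholder secret.
function requireEnv(name: string): string {
  const value = process.env[name];
  if (!value) {
    throw new Error(`Missing required environment variable: ${name}`);
  }
  return value;
}

export function loadConfig(): AppConfig {
  return {
    databaseUrl: requireEnv("DATABASE_URL"),
    apiKey: requireEnv("API_KEY"),
    // Optional values get a default here instead of magic numbers scattered elsewhere.
    port: Number(process.env.PORT ?? 3000),
  };
}
```

Callers import `loadConfig()` instead of reading `process.env` directly, which keeps secrets out of source files and keeps each module small enough to stay under the 500-line guideline.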
## Tool Usage Guidelines

- Use `insert_content` when creating new files or when the target file is empty
- Use `apply_diff` when modifying existing code, always with complete search and replace blocks
- Only use `search_and_replace` as a last resort, and always include both search and replace parameters
- Always verify all required parameters are included before executing any tool

### Batchtools Optimization Strategies

- **Parallel Operations**: Execute independent tasks simultaneously using batchtools
- **Concurrent Analysis**: Analyze multiple components or patterns in parallel
- **Batch Processing**: Group related operations for optimal performance
- **Pipeline Optimization**: Chain operations with parallel execution at each stage

### Performance Features

- **Smart Batching**: Automatically group similar operations for efficiency
- **Concurrent Validation**: Validate multiple aspects simultaneously
- **Parallel File Operations**: Read, analyze, and modify multiple files concurrently
- **Resource Optimization**: Efficient resource utilization with parallel processing

## Available Tools (Enhanced)

- **read**: File reading and viewing with parallel processing
- **edit**: File modification and creation with batch operations
- **browser**: Web browsing capabilities with concurrent requests
- **mcp**: Model Context Protocol tools with parallel communication
- **command**: Command execution with concurrent processing

### Batchtools Integration

- **parallel()**: Execute multiple operations concurrently
- **batch()**: Group related operations for optimal performance
- **pipeline()**: Chain operations with parallel stages
- **concurrent()**: Run independent tasks simultaneously
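The batchtools helpers above are only named here, not specified. The TypeScript sketch below shows one way helpers with these names could compose, using locally defined stand-ins with assumed signatures (each task is an async function); it is illustrative only, not the actual batchtools API:

```typescript
// Hypothetical stand-ins — the real batchtools signatures are not documented here.
type Task<T> = () => Promise<T>;

// parallel(): run independent tasks concurrently and collect their results.
const parallel = <T>(tasks: Task<T>[]): Promise<T[]> =>
  Promise.all(tasks.map((task) => task()));

// batch(): run groups of related tasks one group at a time,
// with the tasks inside each group running concurrently.
async function batch<T>(groups: Task<T>[][]): Promise<T[][]> {
  const results: T[][] = [];
  for (const group of groups) {
    results.push(await parallel(group));
  }
  return results;
}

// pipeline(): chain stages, feeding each stage the previous stage's output.
async function pipeline<T>(input: T, stages: Array<(value: T) => Promise<T>>): Promise<T> {
  let current = input;
  for (const stage of stages) {
    current = await stage(current);
  }
  return current;
}

// Example: analyze three modules concurrently, then run a validation stage on the summary.
async function main(): Promise<void> {
  const analyses = await parallel([
    async () => "auth module analyzed",
    async () => "api module analyzed",
    async () => "db module analyzed",
  ]);
  const report = await pipeline(analyses.join("; "), [
    async (summary) => `validated: ${summary}`,
  ]);
  console.log(report);
}

main().catch(console.error);
```

In an actual claude-flow setup these operations are driven through the CLI commands shown below rather than called directly; the sketch is only meant to convey the concurrency model.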
## Usage (Batchtools Enhanced)

To use this optimized SPARC mode, you can:

1. **Run directly with parallel processing**: `./claude-flow sparc run code "your task" --parallel`
2. **Batch operation mode**: `./claude-flow sparc batch code "tasks-file.json" --concurrent`
3. **Pipeline processing**: `./claude-flow sparc pipeline code "your task" --stages`
4. **Use in a concurrent workflow**: Include `code` in a parallel SPARC workflow
5. **Delegate with optimization**: Use `new_task` with the `--batch-optimize` flag

## Example Commands (Optimized)

### Standard Operations

```bash
# Run this specific mode
./claude-flow sparc run code "implement REST API endpoints with concurrent optimization"

# Use with memory namespace and parallel processing
./claude-flow sparc run code "your task" --namespace code --parallel

# Non-interactive mode with batchtools optimization
./claude-flow sparc run code "your task" --non-interactive --batch-optimize
```

### Batchtools Operations

```bash
# Parallel execution with multiple related tasks
./claude-flow sparc parallel code "task1,task2,task3" --concurrent

# Batch processing from a configuration file
./claude-flow sparc batch code tasks-config.json --optimize

# Pipeline execution with staged processing
./claude-flow sparc pipeline code "complex-task" --stages parallel,validate,optimize
```

### Performance Optimization

```bash
# Monitor performance during execution
./claude-flow sparc run code "your task" --monitor --performance

# Use concurrent processing with resource limits
./claude-flow sparc concurrent code "your task" --max-parallel 5 --resource-limit 80%

# Batch execution with smart optimization
./claude-flow sparc smart-batch code "your task" --auto-optimize --adaptive
```

## Memory Integration (Enhanced)

### Standard Memory Operations

```bash
# Store mode-specific context
./claude-flow memory store "code_context" "important decisions" --namespace code

# Query previous work
./claude-flow memory query "code" --limit 5
```

### Batchtools Memory Operations

```bash
# Batch-store multiple related contexts
./claude-flow memory batch-store "code_contexts.json" --namespace code --parallel

# Concurrent query across multiple namespaces
./claude-flow memory parallel-query "code" --namespaces code,project,arch --concurrent

# Export mode-specific memory with compression
./claude-flow memory export "code_backup.json" --namespace code --compress --parallel
```

## Performance Optimization Features

### Parallel Processing Capabilities

- **Concurrent File Operations**: Process multiple files simultaneously
- **Parallel Analysis**: Analyze multiple components or patterns concurrently
- **Batch Code Generation**: Create multiple code artifacts in parallel
- **Concurrent Validation**: Validate multiple aspects simultaneously

### Smart Batching Features

- **Operation Grouping**: Automatically group related operations
- **Resource Optimization**: Efficient use of system resources
- **Pipeline Processing**: Chain operations with parallel stages
- **Adaptive Scaling**: Adjust concurrency based on system performance

### Performance Monitoring

- **Real-time Metrics**: Monitor operation performance in real time
- **Resource Usage**: Track CPU, memory, and I/O utilization
- **Bottleneck Detection**: Identify and resolve performance bottlenecks
- **Optimization Recommendations**: Automatic suggestions for performance improvements

## Batchtools Best Practices for 🧠 Auto-Coder

### When to Use Parallel Operations

✅ **Use parallel processing when:**

- Implementing multiple functions or classes simultaneously
- Analyzing code patterns across multiple files
- Performing concurrent code optimization
- Generating multiple code modules in parallel

### Optimization Guidelines

- Use batch operations for creating multiple source files
- Enable parallel code analysis for large codebases
- Implement concurrent optimization for performance improvements
- Use pipeline processing for multi-stage code generation

### Performance Tips

- Monitor compilation performance during parallel code generation
- Use smart batching for related code modules
- Enable concurrent processing for independent code components
- Implement parallel validation for code quality checks

## Integration with Other SPARC Modes

### Concurrent Mode Execution

```bash
# Run multiple modes in parallel for comprehensive analysis
./claude-flow sparc concurrent code,architect,security-review "your project" --parallel

# Pipeline execution across multiple modes
./claude-flow sparc pipeline code->code->tdd "feature implementation" --optimize
```

### Batch Workflow Integration

```bash
# Execute a complete workflow with batchtools optimization
./claude-flow sparc workflow code-workflow.json --batch-optimize --monitor
```

For detailed 🧠 Auto-Coder documentation and batchtools integration guides, see:

- Mode Guide: https://github.com/ruvnet/claude-code-flow/docs/sparc-code.md
- Batchtools Integration: https://github.com/ruvnet/claude-code-flow/docs/batchtools-code.md
