# Adobe Commerce Support MCP Server
This MCP server helps generate professional Adobe Commerce support responses from case findings.
## Installation

- Dependencies are already installed ✅
- Server is ready to use ✅
## Usage with Cursor

To use this MCP server with Cursor, add the following configuration to your Cursor settings.

### Option 1: Add to Cursor Settings (Recommended)

1. Open Cursor Settings (Cmd+,)
2. Search for "MCP"
3. Add this server configuration:
```json
{
  "mcpServers": {
    "adobe-support-mcp": {
      "command": "python3",
      "args": ["/Users/kavingas/VSCodeProjects/mcp_support_server/mcp_support_server.py"],
      "env": {}
    }
  }
}
```

### Option 2: Use the provided config file

The `mcp_config.json` file in this directory contains the ready-to-use configuration.
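Before restarting Cursor, you can sanity-check that the configuration is well-formed JSON with Python's built-in parser. A minimal sketch (writes the config shown above to a temp file rather than assuming your working directory):

```shell
# Write the configuration to a temp file and verify it parses as JSON.
cat > /tmp/adobe_mcp_config.json <<'EOF'
{
  "mcpServers": {
    "adobe-support-mcp": {
      "command": "python3",
      "args": ["/Users/kavingas/VSCodeProjects/mcp_support_server/mcp_support_server.py"],
      "env": {}
    }
  }
}
EOF
python3 -m json.tool /tmp/adobe_mcp_config.json > /dev/null && echo "config is valid JSON"
```

If Cursor ignores the server, a malformed config file is the most common cause, so this one-liner check is worth running first.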
## How to Use

### Option 1: Structured Content (Traditional)

1. Create a `find.md` file with properly structured content:

   ```markdown
   ### Findings
   [Your investigation findings here]

   ### Analysis
   [Your technical analysis here]

   ### Recommendations
   [Your recommended solutions here]
   ```

2. Use the `generate_support_reply` tool to create a professional customer response.
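Splitting a structured `find.md` into its sections only requires matching the `###` headings. A rough illustration of that parsing step (a sketch, not the server's actual code; `parse_findings` is a hypothetical helper name):

```python
import re

def parse_findings(text: str) -> dict:
    """Split a find.md-style document into sections keyed by '###' heading."""
    sections = {}
    current = None
    for line in text.splitlines():
        match = re.match(r"^###\s+(.*)", line)
        if match:
            current = match.group(1).strip()
            sections[current] = []
        elif current is not None:
            sections[current].append(line)
    # Join each section's lines back into a single trimmed string.
    return {name: "\n".join(lines).strip() for name, lines in sections.items()}

doc = """### Findings
Checkout fails after the 2.4.7 upgrade.
### Analysis
Full-page cache misconfiguration.
### Recommendations
Flush caches and reindex."""

print(parse_findings(doc)["Analysis"])  # → Full-page cache misconfiguration.
```

Keeping the three headings exactly as shown in the template is what makes this kind of parsing reliable.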
### Option 2: Mixed Content (New - LLM Assisted)

1. Create a `find.md` file with mixed/unstructured content (any format).
2. Use either:
   - the `categorize_mixed_content` tool first, then use the LLM to process the categorization prompt, or
   - `generate_support_reply` with `auto_categorize=true` (the default) for automatic handling.

The server will automatically detect whether your content needs categorization and provide LLM prompts to organize it properly.
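The detection step can be as simple as checking whether the three expected headings are present. A minimal sketch of such a heuristic (an assumption for illustration, not necessarily what this server implements):

```python
# Assumed heuristic: content counts as "structured" only if all three
# template headings from Option 1 are present.
REQUIRED_SECTIONS = ("### Findings", "### Analysis", "### Recommendations")

def needs_categorization(text: str) -> bool:
    """Return True when the file should go through the LLM categorization step."""
    return not all(heading in text for heading in REQUIRED_SECTIONS)

print(needs_categorization("random pasted notes from the case"))  # True
structured = "### Findings\n...\n### Analysis\n...\n### Recommendations\n..."
print(needs_categorization(structured))  # False
```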
## Tools Available

- `generate_support_reply`: Generates a professional Adobe Commerce support response from case findings.
  - `find_file`: Input file (default: `"find.md"`)
  - `resp_file`: Output file (default: `"resp.md"`)
  - `auto_categorize`: Automatically handle mixed content (default: `true`)
- `categorize_mixed_content`: Creates LLM prompts to categorize mixed content into a structured format.
  - `find_file`: Input file with mixed content (default: `"find.md"`)
  - `categorized_file`: Output file with the categorization prompt (default: `"categorized.md"`)
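MCP clients invoke tools over JSON-RPC with a `tools/call` request, so a call to `generate_support_reply` would carry a payload along these lines (the envelope follows the MCP specification; the argument names are the parameters listed above):

```python
import json

# JSON-RPC 2.0 request an MCP client would send to invoke the tool.
request = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "tools/call",
    "params": {
        "name": "generate_support_reply",
        "arguments": {
            "find_file": "find.md",
            "resp_file": "resp.md",
            "auto_categorize": True,
        },
    },
}
print(json.dumps(request, indent=2))
```

Cursor builds and sends this request for you when you invoke the tool in chat; the sketch is only to show which names end up where.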
## Files

- `mcp_support_server.py` - Main MCP server
- `requirements.txt` - Python dependencies
- `mcp_config.json` - Cursor configuration
- `setup.py` - Installation script