
Self-hosted LLM MCP Server

test-mcp.sh • 2.39 kB
#!/bin/bash

echo "🧪 Testing MCP Server..."

# Colors for output
RED='\033[0;31m'
GREEN='\033[0;32m'
YELLOW='\033[1;33m'
NC='\033[0m' # No Color

# Function to test an endpoint
test_endpoint() {
    local name="$1"
    local method="$2"
    local url="$3"
    local data="$4"

    echo -e "\n${YELLOW}Testing: $name${NC}"

    if [ -n "$data" ]; then
        response=$(curl -s -X "$method" "$url" \
            -H "Content-Type: application/json" \
            -d "$data")
    else
        response=$(curl -s -X "$method" "$url")
    fi

    if [ $? -eq 0 ]; then
        echo -e "${GREEN}✅ $name: SUCCESS${NC}"
        echo "$response" | jq . 2>/dev/null || echo "$response"
    else
        echo -e "${RED}❌ $name: FAILED${NC}"
    fi
}

# 1. Health Check
test_endpoint "Health Check" "GET" "http://localhost:3000/health"

# 2. List Tools
test_endpoint "List Tools" "POST" "http://localhost:3000/mcp/tools/list"

# 3. Generate Text
test_endpoint "Generate Text" "POST" "http://localhost:3000/mcp/tools/call" '{
  "name": "generate_text",
  "arguments": {
    "prompt": "Say hello in a creative way",
    "maxTokens": 50
  }
}'

# 4. Store Data
test_endpoint "Store Data" "POST" "http://localhost:3000/mcp/tools/call" '{
  "name": "store_data",
  "arguments": {
    "table": "test_data",
    "data": {
      "message": "Hello from MCP Server!",
      "timestamp": "'$(date -u +%Y-%m-%dT%H:%M:%SZ)'",
      "test": true
    }
  }
}'

# 5. Retrieve Data
test_endpoint "Retrieve Data" "POST" "http://localhost:3000/mcp/tools/call" '{
  "name": "retrieve_data",
  "arguments": {
    "table": "test_data",
    "limit": 5
  }
}'

# 6. Query Database
test_endpoint "Query Database" "POST" "http://localhost:3000/mcp/tools/call" '{
  "name": "query_database",
  "arguments": {
    "query": "SELECT 1 as test_value",
    "table": "test"
  }
}'

# 7. Error Testing
test_endpoint "Invalid Tool" "POST" "http://localhost:3000/mcp/tools/call" '{
  "name": "invalid_tool",
  "arguments": {}
}'

echo -e "\n${GREEN}🎉 MCP Server testing completed!${NC}"
echo -e "\n${YELLOW}Next steps:${NC}"
echo "1. Check the responses above for any errors"
echo "2. Verify your Supabase connection is working"
echo "3. Ensure Ollama is running on port 11434"
echo "4. Check Docker logs: docker-compose logs mcp-server"
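A minimal way to run the script, assuming the MCP server from the script is already listening on http://localhost:3000 and that curl and (optionally) jq are installed, is shown below. This is a sketch of a typical invocation, not part of the script itself:

# Make the test script executable and run it against the local server.
# Assumes test-mcp.sh sits in the current directory and the server is up on localhost:3000.
chmod +x test-mcp.sh
./test-mcp.sh

# If any test fails, the "Next steps" hints printed at the end apply; for example,
# the server container logs can be inspected with:
docker-compose logs mcp-server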

MCP directory API

We provide all the information about MCP servers via our MCP API.

curl -X GET 'https://glama.ai/api/mcp/v1/servers/Krishnahuex28/MCP'
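As a small sketch built on the curl command above, the response can be pretty-printed with jq. The response is assumed to be JSON; no specific fields are assumed beyond that:

# Fetch this server's directory entry and format the JSON response for reading.
curl -s -X GET 'https://glama.ai/api/mcp/v1/servers/Krishnahuex28/MCP' | jq .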

If you have feedback or need assistance with the MCP directory API, please join our Discord server.