
M.I.M.I.R - Multi-agent Intelligent Memory & Insight Repository

by orneryd
check-storage.sh • 1.8 kB
#!/bin/bash
# Check storage usage for Mimir and Ollama

echo "📊 Mimir Storage Usage Report"
echo "━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━"
echo ""

echo "🐳 Docker Ollama Models (Containerized):"
if [ -d "./data/ollama" ]; then
  du -sh ./data/ollama
  echo " Location: ./data/ollama/models/"
else
  echo " Not found (no models pulled yet)"
fi
echo ""

echo "💻 Host Ollama Models (Local):"
if [ -d "$HOME/.ollama" ]; then
  du -sh ~/.ollama
  echo " Location: ~/.ollama/models/"
else
  echo " Not found (Ollama not installed locally)"
fi
echo ""

echo "🗄️ Neo4j Database:"
if [ -d "./data/neo4j" ]; then
  du -sh ./data/neo4j
  echo " Location: ./data/neo4j/"
else
  echo " Not found"
fi
echo ""

echo "🔨 Build Artifacts:"
if [ -d "./build" ]; then
  du -sh ./build
else
  echo " Not found"
fi
echo ""

echo "📦 Node Modules:"
if [ -d "./node_modules" ]; then
  du -sh ./node_modules
else
  echo " Not found"
fi
echo ""

echo "📝 Logs:"
if [ -d "./logs" ]; then
  du -sh ./logs
else
  echo " Not found"
fi
echo ""

echo "━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━"
echo "📊 Summary:"
echo ""
if [ -d "./data" ]; then
  echo "Total ./data directory:"
  du -sh ./data
fi
echo ""

echo "🐳 Docker Images:"
docker images --format "table {{.Repository}}\t{{.Tag}}\t{{.Size}}" | grep -E "REPOSITORY|mcp-server|ollama|neo4j" || echo " No images found"
echo ""

echo "🐳 Docker System:"
docker system df
echo ""

echo "━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━"
echo "💡 To clean up storage, see: docs/STORAGE_CLEANUP.md"
echo ""
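The script measures paths such as ./data, ./build, and ./node_modules relative to the current directory, so a minimal way to run it (assuming it sits at the repository root and is not yet executable) is:

chmod +x check-storage.sh
./check-storage.sh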


MCP directory API

We provide all the information about MCP servers via our MCP API.

curl -X GET 'https://glama.ai/api/mcp/v1/servers/orneryd/Mimir'
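A small sketch for inspecting the response, assuming it is JSON and that jq is installed, using curl's silent flag to suppress progress output:

curl -s 'https://glama.ai/api/mcp/v1/servers/orneryd/Mimir' | jq .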

If you have feedback or need assistance with the MCP directory API, please join our Discord server.