
M.I.M.I.R - Multi-agent Intelligent Memory & Insight Repository

by orneryd
build-llama-cpp.sh (1.25 kB)
#!/bin/bash
# Build and publish llama.cpp server for ARM64
set -e

# Configuration
IMAGE_NAME="timothyswt/llama-cpp-server-arm64"
VERSION="${1:-latest}"
DOCKER_USERNAME="${DOCKER_USERNAME:-timothyswt}"

echo "🔨 Building llama.cpp server for ARM64..."
echo "Image: $IMAGE_NAME:$VERSION"
echo ""

# Build the image
docker build \
  --platform linux/arm64 \
  -t "$IMAGE_NAME:$VERSION" \
  -t "$IMAGE_NAME:latest" \
  -f docker/llama-cpp/Dockerfile \
  .

echo ""
echo "✅ Build complete!"
echo ""
echo "🔍 Image details:"
docker images | grep llama-cpp-server-arm64 | head -n 2

echo ""
echo "📦 Pushing to Docker Hub..."
echo "   (Make sure you're logged in: docker login)"
echo ""

# Ask for confirmation
read -p "Push to Docker Hub? (y/N) " -n 1 -r
echo
if [[ $REPLY =~ ^[Yy]$ ]]; then
  docker push "$IMAGE_NAME:$VERSION"
  if [ "$VERSION" != "latest" ]; then
    docker push "$IMAGE_NAME:latest"
  fi
  echo "✅ Published to Docker Hub!"
else
  echo "⏭️  Skipped push. To push manually:"
  echo "  docker push $IMAGE_NAME:$VERSION"
  echo "  docker push $IMAGE_NAME:latest"
fi

echo ""
echo "🎉 Done! To use this image:"
echo "  docker run -p 11434:8080 -v ./models:/models $IMAGE_NAME:$VERSION"
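The script accepts an optional version tag as its first argument and falls back to "latest" via Bash parameter expansion (`VERSION="${1:-latest}"`). A minimal sketch of that idiom in isolation (the `version_of` helper is hypothetical, for illustration only):

```shell
#!/bin/bash
# Demonstrate the ${1:-latest} default-argument idiom from the build script.
version_of() {
  # ${1:-latest} expands to $1 if set and non-empty, otherwise to "latest"
  local v="${1:-latest}"
  echo "$v"
}

version_of          # prints "latest"
version_of v1.2.3   # prints "v1.2.3"
```

So `./build-llama-cpp.sh` builds and tags `latest`, while `./build-llama-cpp.sh v1.2.3` tags both `v1.2.3` and `latest`.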

MCP directory API

We provide all the information about MCP servers via our MCP API.

curl -X GET 'https://glama.ai/api/mcp/v1/servers/orneryd/Mimir'

If you have feedback or need assistance with the MCP directory API, please join our Discord server.