Zignet

by fulgidus
Do What The F*ck You Want To Public License
# zignet

MCP server for Zig — AI-powered code generation, debugging, and documentation.

## Features

- **generate_zig_code**: Generate idiomatic Zig code from natural language descriptions
- **debug_zig_code**: Analyze and debug Zig code with AI assistance
- **explain_zig_docs**: Get explanations of Zig language features and the standard library

## Installation

```bash
npm install
```

## Usage

Start the MCP server:

```bash
npm start
```

Or use the CLI directly:

```bash
./bin/cli.js
```

## Configuration

Environment variables:

- `ZIGNET_CACHE_DIR`: Model cache directory (default: `~/.cache/zignet`)
- `ZIGNET_MODEL_NAME`: Model filename (default: `tinyllama-1.1b-chat-v1.0.Q4_K_M.gguf`)
- `ZIGNET_MODEL_REPO`: Hugging Face repository (default: `TheBloke/TinyLlama-1.1B-Chat-v1.0-GGUF`)
- `ZIGNET_MODEL_FILE`: Model file within the repository (default: `tinyllama-1.1b-chat-v1.0.Q4_K_M.gguf`)

## Structure

- `bin/cli.js`: Executable entrypoint; downloads the model and starts the MCP stdio server
- `src/server.js`: MCP server implementation with tool handlers
- `src/model.js`: Model loading and text generation using node-llama-cpp
- `src/tools.js`: Tool definitions for the three AI tools
- `src/download.js`: Model caching and download logic
- `scripts/download-model.js`: Hugging Face model download script

## License

WTFPL v2 — Do What The Fuck You Want To Public License
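Since zignet speaks MCP over stdio, it can be registered with any MCP-compatible client that launches stdio servers. A minimal sketch of such a client configuration (the `mcpServers` key follows the convention used by clients like Claude Desktop; the `zignet` entry name, the path, and the cache directory shown here are placeholders — adjust them to your setup and your client's exact schema):

```json
{
  "mcpServers": {
    "zignet": {
      "command": "node",
      "args": ["./bin/cli.js"],
      "env": {
        "ZIGNET_CACHE_DIR": "/home/user/.cache/zignet"
      }
    }
  }
}
```

On first launch the entrypoint downloads the model into the cache directory, so the initial start can take noticeably longer than subsequent ones.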
