# zignet

MCP server for Zig: AI-powered code generation, debugging, and documentation, backed by a local LLM.
## Features
- `generate_zig_code`: Generate idiomatic Zig code from natural-language descriptions
- `debug_zig_code`: Analyze and debug Zig code with AI assistance
- `explain_zig_docs`: Get explanations of Zig language features and the standard library
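An MCP client invokes these tools over stdio as JSON-RPC 2.0 `tools/call` requests. A sketch of a `generate_zig_code` call is below; the `description` argument name is an assumption, not confirmed by this README:

```json
{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "tools/call",
  "params": {
    "name": "generate_zig_code",
    "arguments": {
      "description": "a function that reverses a slice of u8 in place"
    }
  }
}
```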
## Installation
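Assuming the package is published to npm under the name `zignet`, a typical global install would be:

```bash
npm install -g zignet
```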
## Usage
Start the MCP server:
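Assuming the package exposes a `zignet` bin entry (a sketch; the actual command name follows the package's `bin` field):

```bash
npx zignet
```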
Or use the CLI directly:
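The entrypoint listed under Structure suggests running it straight from a checkout:

```bash
node bin/cli.js
```

To register the server with an MCP client such as Claude Desktop, an entry along these lines should work (the `npx` invocation assumes the npm package name above):

```json
{
  "mcpServers": {
    "zignet": {
      "command": "npx",
      "args": ["zignet"]
    }
  }
}
```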
## Configuration
Environment variables:
- `ZIGNET_CACHE_DIR`: Model cache directory (default: `~/.cache/zignet`)
- `ZIGNET_MODEL_NAME`: Model filename (default: `tinyllama-1.1b-chat-v1.0.Q4_K_M.gguf`)
- `ZIGNET_MODEL_REPO`: HuggingFace repository (default: `TheBloke/TinyLlama-1.1B-Chat-v1.0-GGUF`)
- `ZIGNET_MODEL_FILE`: Model file in the repository (default: `tinyllama-1.1b-chat-v1.0.Q4_K_M.gguf`)
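For example, to cache models in a custom directory before launching the server (the cache path here is illustrative; the model values shown are the documented defaults):

```bash
export ZIGNET_CACHE_DIR="$HOME/models/zignet"
export ZIGNET_MODEL_REPO="TheBloke/TinyLlama-1.1B-Chat-v1.0-GGUF"
export ZIGNET_MODEL_FILE="tinyllama-1.1b-chat-v1.0.Q4_K_M.gguf"
export ZIGNET_MODEL_NAME="tinyllama-1.1b-chat-v1.0.Q4_K_M.gguf"
node bin/cli.js
```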
## Structure
- `bin/cli.js`: Executable entrypoint; downloads the model and starts the MCP stdio server
- `src/server.js`: MCP server implementation with tool handlers
- `src/model.js`: Model loading and text generation using node-llama-cpp
- `src/tools.js`: Tool definitions for the three AI tools
- `src/download.js`: Model caching and download logic
- `scripts/download-model.js`: HuggingFace model download script
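A minimal sketch of what model loading and generation in `src/model.js` might look like, using the node-llama-cpp v3 API (zignet's actual code may differ; the prompt and cache-path logic here are assumptions mirroring the defaults above):

```js
import { getLlama, LlamaChatSession } from "node-llama-cpp";
import path from "node:path";
import os from "node:os";

// Resolve the GGUF path from ZIGNET_CACHE_DIR / ZIGNET_MODEL_NAME, falling
// back to the documented defaults.
const modelPath = path.join(
  process.env.ZIGNET_CACHE_DIR ?? path.join(os.homedir(), ".cache", "zignet"),
  process.env.ZIGNET_MODEL_NAME ?? "tinyllama-1.1b-chat-v1.0.Q4_K_M.gguf"
);

const llama = await getLlama();                      // bind the native llama.cpp backend
const model = await llama.loadModel({ modelPath });  // load the GGUF weights
const context = await model.createContext();
const session = new LlamaChatSession({ contextSequence: context.getSequence() });

// Generate Zig code from a natural-language description.
const reply = await session.prompt(
  "Write an idiomatic Zig function that reverses a []u8 in place."
);
console.log(reply);
```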
## License
WTFPL v2 — Do What The Fuck You Want To Public License