Kobold MCP Server
A Model Context Protocol (MCP) server implementation for interfacing with KoboldAI. This server enables integration between KoboldAI's text generation capabilities and MCP-compatible applications.
Features
- Text generation with KoboldAI
- Chat completion with persistent memory
- OpenAI-compatible API endpoints
- Stable Diffusion integration
- Built on the official MCP SDK
- TypeScript implementation
<a href="https://glama.ai/mcp/servers/a2xd4hoij7"><img width="380" height="200" src="https://glama.ai/mcp/servers/a2xd4hoij7/badge" alt="Kobold Server MCP server" /></a>
Installation
Prerequisites
- Node.js (v16 or higher)
- npm or yarn package manager
- Running KoboldAI instance
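Before installing the server, it can help to confirm that your KoboldAI instance is actually reachable. The snippet below is a minimal sketch of such a check; the base URL and the `KOBOLD_API_URL` variable name are assumptions (KoboldCpp defaults to port 5001, KoboldAI United to 5000), and `/api/v1/model` is the KoboldAI United endpoint for querying the loaded model.

```typescript
// check-kobold.ts - quick reachability check for a KoboldAI instance.
// The base URL and the KOBOLD_API_URL variable name are assumptions:
// KoboldCpp defaults to port 5001, KoboldAI United to 5000 - adjust as needed.
import fetch from "node-fetch";

const KOBOLD_URL = process.env.KOBOLD_API_URL ?? "http://localhost:5001";

async function checkKobold(): Promise<void> {
  const res = await fetch(`${KOBOLD_URL}/api/v1/model`);
  if (!res.ok) {
    throw new Error(`KoboldAI responded with HTTP ${res.status}`);
  }
  console.log("KoboldAI is reachable, current model:", await res.json());
}

checkKobold().catch((err) => {
  console.error("Could not reach KoboldAI:", err);
  process.exit(1);
});
```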
Usage
Configuration
The server can be configured through environment variables or a configuration object:
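The concrete option names are not documented here, so the following is only a sketch: it assumes environment variables such as `KOBOLD_API_URL` and `KOBOLD_TIMEOUT_MS`, and a configuration object with matching fields, which may differ from the server's actual settings.

```typescript
// config.ts - illustrative sketch of assembling the server configuration.
// The variable names (KOBOLD_API_URL, KOBOLD_TIMEOUT_MS) and the shape of
// KoboldConfig are assumptions for illustration, not documented options.
interface KoboldConfig {
  apiUrl: string;    // base URL of the running KoboldAI instance
  timeoutMs: number; // request timeout for generation calls
}

const config: KoboldConfig = {
  apiUrl: process.env.KOBOLD_API_URL ?? "http://localhost:5001",
  timeoutMs: Number(process.env.KOBOLD_TIMEOUT_MS ?? 120_000),
};

export default config;
```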
Supported APIs
- Core KoboldAI API (text generation, model info)
- Chat completion with conversation memory
- Text completion (OpenAI-compatible)
- Stable Diffusion integration (txt2img, img2img)
- Audio transcription and text-to-speech
- Web search capabilities
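As an illustration of how these capabilities are consumed, an MCP client built with the official TypeScript SDK can launch this server over stdio, list its tools, and invoke one. The launch command, the `dist/index.js` entry point, and the tool name `text_generation` below are assumptions for the sketch; use the tool listing to discover the names the server actually exposes.

```typescript
// client-example.ts - sketch of calling this server from an MCP client (SDK 1.x).
// The server launch command and the tool name "text_generation" are assumptions;
// call client.listTools() to see the tools the server really provides.
import { Client } from "@modelcontextprotocol/sdk/client/index.js";
import { StdioClientTransport } from "@modelcontextprotocol/sdk/client/stdio.js";

async function main() {
  const transport = new StdioClientTransport({
    command: "node",
    args: ["dist/index.js"], // assumed build output of this server
  });

  const client = new Client({ name: "example-client", version: "1.0.0" });
  await client.connect(transport);

  // Discover the tools the server exposes.
  const tools = await client.listTools();
  console.log(tools.tools.map((t) => t.name));

  // Call a text-generation tool (name and arguments are illustrative).
  const result = await client.callTool({
    name: "text_generation",
    arguments: { prompt: "Once upon a time", max_length: 80 },
  });
  console.log(result);

  await client.close();
}

main().catch(console.error);
```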
Development
- Clone the repository
- Install dependencies
- Build the project
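These steps typically look like the following; the repository URL is a placeholder, and the `build` script is assumed from the TypeScript setup.

```bash
# Clone the repository (replace the URL with the actual repository location)
git clone https://github.com/<owner>/kobold-mcp-server.git
cd kobold-mcp-server

# Install dependencies
npm install

# Build the project (assumes a standard "build" script for the TypeScript sources)
npm run build
```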
Dependencies
- @modelcontextprotocol/sdk: ^1.0.1
- node-fetch: ^2.6.1
- zod: ^3.20.0
- zod-to-json-schema: ^3.23.5
Contributing
Contributions welcome! Please feel free to submit a Pull Request.
License
MIT License - see LICENSE file for details.
Support
For issues and feature requests, please use the GitHub issue tracker.