Hayhooks
Hayhooks makes it easy to deploy and serve Haystack Pipelines and Agents.
With Hayhooks, you can:
đĻ Deploy your Haystack pipelines and agents as REST APIs with maximum flexibility and minimal boilerplate code.
đ ī¸ Expose your Haystack pipelines and agents over the MCP protocol, making them available as tools in AI dev environments like Cursor or Claude Desktop. Under the hood, Hayhooks runs as an MCP Server, exposing each pipeline and agent as an MCP Tool.
đŦ Integrate your Haystack pipelines and agents as OpenAI-compatible chat completion backends with streaming support.
đšī¸ Control Hayhooks core API endpoints through chat - deploy, undeploy, list, or run Haystack pipelines and agents by chatting with Claude Desktop, Cursor, or any other MCP client.
Documentation
đ For detailed guides, examples, and API reference, check out our documentation.
Quick Start
1. Install Hayhooks
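Install the package from PyPI:

```shell
pip install hayhooks
```
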
2. Start Hayhooks
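Start the server with the Hayhooks CLI (by default it listens on `localhost:1416`; host and port are configurable):

```shell
hayhooks run
```
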
3. Create a simple agent
Create a minimal agent wrapper with streaming chat support and a simple HTTP POST API:
Save as my_agent_dir/pipeline_wrapper.py.
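A minimal sketch of such a wrapper, based on Hayhooks' `BasePipelineWrapper` interface: `setup()` builds the agent once at deploy time, `run_api()` backs the HTTP POST endpoint, and `run_chat_completion()` backs the OpenAI-compatible endpoint. The model name and system prompt are placeholders; adapt them to your setup.

```python
from haystack.components.agents import Agent
from haystack.components.generators.chat import OpenAIChatGenerator
from haystack.dataclasses import ChatMessage

from hayhooks import BasePipelineWrapper, streaming_generator


class PipelineWrapper(BasePipelineWrapper):
    def setup(self) -> None:
        # Build the agent once when the pipeline is deployed
        self.agent = Agent(
            chat_generator=OpenAIChatGenerator(model="gpt-4o-mini"),
            system_prompt="You are a helpful assistant.",
        )

    def run_api(self, question: str) -> str:
        # Backs POST /my_agent/run
        result = self.agent.run(messages=[ChatMessage.from_user(question)])
        return result["last_message"].text

    def run_chat_completion(self, model: str, messages: list, body: dict):
        # Backs the OpenAI-compatible chat completion endpoint, with streaming
        chat_messages = [ChatMessage.from_openai_dict_format(m) for m in messages]
        return streaming_generator(
            pipeline=self.agent,
            pipeline_run_args={"messages": chat_messages},
        )
```
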
4. Deploy it
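Deploy the wrapper directory with the CLI, giving the pipeline a name (here `my_agent`, which determines the route prefix):

```shell
hayhooks pipeline deploy-files -n my_agent ./my_agent_dir
```
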
5. Run it
Call the HTTP POST API (/my_agent/run):
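For example, assuming the server is running on the default `localhost:1416` and the wrapper's `run_api` takes a `question` parameter:

```shell
curl -X POST http://localhost:1416/my_agent/run \
  -H "Content-Type: application/json" \
  -d '{"question": "What is Haystack?"}'
```
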
Call the OpenAI-compatible chat completion API (streaming enabled):
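For example, again assuming the default `localhost:1416`, with the deployed pipeline name passed as the `model`:

```shell
curl http://localhost:1416/chat/completions \
  -H "Content-Type: application/json" \
  -d '{
    "model": "my_agent",
    "messages": [{"role": "user", "content": "Hello!"}],
    "stream": true
  }'
```

Because the endpoint is OpenAI-compatible, any OpenAI client SDK pointed at this base URL should work the same way.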
Or integrate it with Open WebUI and start chatting with it!
Key Features
đ Easy Deployment
Deploy Haystack pipelines and agents as REST APIs with minimal setup
Support for both YAML-based and wrapper-based pipeline deployment
Automatic OpenAI-compatible endpoint generation
đ Multiple Integration Options
MCP Protocol: Expose pipelines as MCP tools for use in AI development environments
Open WebUI Integration: Use Hayhooks as a backend for Open WebUI with streaming support
OpenAI Compatibility: Seamless integration with OpenAI-compatible tools and frameworks
đ§ Developer Friendly
CLI for easy pipeline management
Flexible configuration options
Comprehensive logging and debugging support
Custom route and middleware support
đ File Upload Support
Built-in support for handling file uploads in pipelines
Perfect for RAG systems and document processing
Next Steps
Quick Start Guide - Get started with Hayhooks
Installation - Install Hayhooks and dependencies
Configuration - Configure Hayhooks for your needs
Examples - Explore example implementations
Community & Support
GitHub: deepset-ai/hayhooks
Issues: GitHub Issues
Documentation: Full Documentation
Hayhooks is actively maintained by the deepset team.