Click on "Install Server".
Wait a few minutes for the server to deploy. Once ready, it will show a "Started" state.
In the chat, type `@` followed by the MCP server name and your instructions, e.g., "@MCP Jibun Server Show me the latest posts from River's Lighthouse".
That's it! The server will respond to your query, and you can continue using it as needed.
Here is a step-by-step guide with screenshots.
MCP Jibun Server
This is a Model Context Protocol (MCP) server that allows AI Agents to read posts from Jibun instances. It is also compatible with Ech0 instances.
Features
Provides the following tools for AI Agents to call:
- `get_jibun_posts`: Get the latest posts from a Jibun (or Ech0) instance. Supports specifying the source (`source`), count (`count`), and page number (`page`).
- `list_jibun_sources`: List all configured Jibun (or Ech0) instance sources.
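As a sketch of how an Agent invokes these tools, an MCP `tools/call` request for `get_jibun_posts` might look like the following (the `source` value is a hypothetical example):

```json
{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "tools/call",
  "params": {
    "name": "get_jibun_posts",
    "arguments": {
      "source": "my-jibun",
      "count": 10,
      "page": 1
    }
  }
}
```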
Development and Deployment
1. Local Development and Cloudflare Workers
We use wrangler for local development and Cloudflare deployment.
Configuration:
Configure environment variables in the vars field of the wrangler.jsonc file:
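As an illustrative sketch (the worker name and the instance-object fields shown here are assumptions, not the project's exact schema), the `vars` block in `wrangler.jsonc` might look like:

```jsonc
{
  "name": "mcp-jibun-server",
  "vars": {
    // JSON string listing the Jibun instances the server should read from
    "JIBUN_INSTANCES": "[{\"name\": \"my-jibun\", \"url\": \"https://jibun.example.com\"}]"
  }
}
```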
Common Commands:
Start local server:

```
pnpm dev
```

Deploy to Cloudflare:

```
pnpm run deploy
```
2. Deploy to Vercel / Netlify
This project is based on the Hono framework and automatically adapts to the Vercel and Netlify runtime environments. It can be deployed with one click by connecting a Git repository; note that you must configure the correct environment variable in the management panel: `JIBUN_INSTANCES`.
Format: A JSON string containing a list of Jibun instances.
Example:
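A plausible value might look like the following (the field names are an assumption for illustration; check the project's own example for the exact schema):

```json
[
  { "name": "my-jibun", "url": "https://jibun.example.com" }
]
```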
Client Configuration
MCP Server URL:
- Cloudflare: `https://<your-worker-subdomain>.workers.dev`
- Vercel: `https://<your-project>.vercel.app`
- Netlify: `https://<your-site>.netlify.app`
Supports / or /mcp paths.
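For clients that take a JSON MCP configuration, the entry might look like the following (the exact schema varies by client; this shape follows the common `mcpServers` convention, and the URL is a placeholder):

```json
{
  "mcpServers": {
    "jibun": {
      "url": "https://<your-worker-subdomain>.workers.dev/mcp"
    }
  }
}
```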