Shinkuro

by DiscreteTom

Shinkuro - Prompt synchronization MCP server

Loads markdown files from a local folder or a Git repository (e.g. GitHub) and serves them as prompts over MCP. Supports automatic repository synchronization and YAML frontmatter for prompt metadata.

Usage

Local Files

Add to your MCP client configuration:

{ "mcpServers": { "shinkuro": { "command": "uvx", "args": ["shinkuro"], "env": { "FOLDER": "/path/to/prompts" } } } }

Git Repository

Add to your MCP client configuration:

{ "mcpServers": { "shinkuro": { "command": "uvx", "args": ["shinkuro"], "env": { "GIT_URL": "https://github.com/owner/repo.git", "FOLDER": "" // optional, subfolder within git repo } } } }

This will clone the repository into a local cache directory. Make sure you have the necessary permissions to clone the repository and to write to the cache directory.
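Private repositories can also be used via SSH or HTTPS with credentials (see GIT_URL below). A hedged example using an SSH URL, assuming your SSH key is already configured for the host; the repository and subfolder names are illustrative:

{
  "mcpServers": {
    "shinkuro": {
      "command": "uvx",
      "args": ["shinkuro"],
      "env": {
        "GIT_URL": "git@github.com:owner/private-repo.git",
        "FOLDER": "prompts"
      }
    }
  }
}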

Environment Variables

  • FOLDER: Path to local folder containing markdown files, or subfolder within git repo

  • GIT_URL: Git repository URL (supports GitHub, GitLab, SSH, HTTPS with credentials)

  • CACHE_DIR: Directory to cache cloned repositories (optional, defaults to ~/.shinkuro/remote)

  • AUTO_PULL: Whether to pull latest changes if repo exists locally (optional, defaults to false)
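As a sketch, a configuration that combines these variables for a Git-backed setup might look like the following; the URL, paths, and values are illustrative:

{
  "mcpServers": {
    "shinkuro": {
      "command": "uvx",
      "args": ["shinkuro"],
      "env": {
        "GIT_URL": "https://github.com/owner/repo.git",
        "FOLDER": "prompts",
        "CACHE_DIR": "/path/to/cache",
        "AUTO_PULL": "true"
      }
    }
  }
}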

Prompts

Loading

Each markdown file in the specified folder is loaded as a prompt.

Example folder structure:

my-prompts/
├── code-review.md
├── dev.md
└── commit.md
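With the structure above, a client should see three prompts, one per file (names default to the filenames, as described in the next section). A minimal sketch of listing them with the MCP Python SDK (the mcp package); the folder path is illustrative:

import asyncio
import os

from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client


async def main() -> None:
    # Launch Shinkuro the same way an MCP client configuration would.
    server = StdioServerParameters(
        command="uvx",
        args=["shinkuro"],
        # Illustrative path; os.environ is merged so uvx is still found on PATH.
        env={**os.environ, "FOLDER": "/path/to/my-prompts"},
    )
    async with stdio_client(server) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()
            result = await session.list_prompts()
            for prompt in result.prompts:
                print(prompt.name, "-", prompt.description)


asyncio.run(main())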

Example Prompt File

---
name: "" # optional, defaults to filename
description: "" # optional, defaults to file path
---

# Prompt Content

Your prompt content here...
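For instance, a filled-in code-review.md might look like this; the metadata values and content are illustrative:

---
name: "code-review"
description: "Review a diff for correctness and style"
---

# Code Review

Review the following changes. Point out bugs, style issues, and missing tests.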