Model Context Protocol (MCP) Server for Graphlit Platform
Overview
The Model Context Protocol (MCP) Server enables integration between MCP clients and the Graphlit service. This document outlines the setup process and provides a basic example of using the client.
Ingest anything from Slack, Discord, websites, Google Drive, email, Jira, Linear or GitHub into a Graphlit project - and then search and retrieve relevant knowledge within an MCP client like Cursor, Windsurf, Goose or Cline.
Your Graphlit project acts as a searchable, RAG-ready knowledge base across all your developer and product management tools.
Documents (PDF, DOCX, PPTX, etc.) and HTML web pages will be extracted to Markdown upon ingestion. Audio and video files will be transcribed upon ingestion.
Web crawling and web search are built in as MCP tools, so there is no need to separately integrate other tools like Firecrawl or Exa.
You can read more about the MCP Server use cases and features on our blog.
Watch our latest YouTube video on using the Graphlit MCP Server with the Goose MCP client.
For any questions on using the MCP Server, please join our Discord community and post in the #mcp channel.
Tools
Retrieval
- Query Contents
- Query Collections
- Query Feeds
- Query Conversations
- Retrieve Relevant Sources
- Retrieve Similar Images
- Visually Describe Image
RAG
- Prompt LLM Conversation
Extraction
- Extract Structured JSON from Text
Publishing
- Publish as Audio (ElevenLabs Audio)
- Publish as Image (OpenAI Image Generation)
Ingestion
- Files
- Web Pages
- Messages
- Posts
- Emails
- Issues
- Text
- Memory (Short-Term)
Data Connectors
- Microsoft Outlook email
- Google Mail
- Notion
- Linear
- Jira
- GitHub Issues
- Google Drive
- OneDrive
- SharePoint
- Dropbox
- Box
- GitHub
- Slack
- Microsoft Teams
- Discord
- Twitter/X
- Podcasts (RSS)
Web
- Web Crawling
- Web Search (including Podcast Search)
- Web Mapping
- Screenshot Page
Notifications
- Slack
- Webhook
- Twitter/X
Operations
- Configure Project
- Create Collection
- Add Contents to Collection
- Remove Contents from Collection
- Delete Collection(s)
- Delete Feed(s)
- Delete Content(s)
- Delete Conversation(s)
- Is Feed Done?
- Is Content Done?
Enumerations
- List Slack Channels
- List Microsoft Teams Teams
- List Microsoft Teams Channels
- List SharePoint Libraries
- List SharePoint Folders
- List Linear Projects
- List Notion Databases
- List Notion Pages
- List Dropbox Folders
- List Box Folders
- List Discord Guilds
- List Discord Channels
- List Google Calendars
- List Microsoft Calendars
Resources
- Project
- Contents
- Feeds
- Collections (of Content)
- Workflows
- Conversations
- Specifications
Prerequisites
Before you begin, ensure you have the following:
- Node.js installed on your system (version 18.x or higher recommended).
- An active account on the Graphlit Platform with access to the API settings dashboard.
Configuration
The Graphlit MCP Server uses the following environment variables for authentication and configuration:
- GRAPHLIT_ENVIRONMENT_ID: Your environment ID.
- GRAPHLIT_ORGANIZATION_ID: Your organization ID.
- GRAPHLIT_JWT_SECRET: Your JWT secret for signing the JWT token.
You can find these values in the API settings dashboard on the Graphlit Platform.
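These values are passed to the server through the env block of your MCP client configuration (see Installation below). A sketch of that block's contents, with placeholder values you would replace with your own credentials:

```json
{
  "GRAPHLIT_ORGANIZATION_ID": "your-organization-id",
  "GRAPHLIT_ENVIRONMENT_ID": "your-environment-id",
  "GRAPHLIT_JWT_SECRET": "your-jwt-secret"
}
```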
Installation
Installing via VS Code
For quick installation, use one of the one-click install buttons below:
For manual installation, add the following JSON block to your User Settings (JSON) file in VS Code. You can do this by pressing Ctrl + Shift + P and typing Preferences: Open User Settings (JSON).
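A minimal version of that block, assuming the standard command/args/env layout for an npx-launched MCP server and using placeholder credential values:

```json
{
  "mcp": {
    "servers": {
      "graphlit-mcp-server": {
        "command": "npx",
        "args": ["-y", "graphlit-mcp-server"],
        "env": {
          "GRAPHLIT_ORGANIZATION_ID": "your-organization-id",
          "GRAPHLIT_ENVIRONMENT_ID": "your-environment-id",
          "GRAPHLIT_JWT_SECRET": "your-jwt-secret"
        }
      }
    }
  }
}
```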
Optionally, you can add it to a file called .vscode/mcp.json in your workspace. This will allow you to share the configuration with others. Note that the mcp key is not needed in the .vscode/mcp.json file.
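For the workspace .vscode/mcp.json file, the same entry is used without the outer mcp key; for example (again with placeholder values):

```json
{
  "servers": {
    "graphlit-mcp-server": {
      "command": "npx",
      "args": ["-y", "graphlit-mcp-server"],
      "env": {
        "GRAPHLIT_ORGANIZATION_ID": "your-organization-id",
        "GRAPHLIT_ENVIRONMENT_ID": "your-environment-id",
        "GRAPHLIT_JWT_SECRET": "your-jwt-secret"
      }
    }
  }
}
```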
Installing via Windsurf
To install graphlit-mcp-server in the Windsurf IDE, Windsurf should use NPX:
Your mcp_config.json file should be configured similar to:
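A sketch of that configuration, assuming the standard mcpServers layout and placeholder credential values:

```json
{
  "mcpServers": {
    "graphlit-mcp-server": {
      "command": "npx",
      "args": ["-y", "graphlit-mcp-server"],
      "env": {
        "GRAPHLIT_ORGANIZATION_ID": "your-organization-id",
        "GRAPHLIT_ENVIRONMENT_ID": "your-environment-id",
        "GRAPHLIT_JWT_SECRET": "your-jwt-secret"
      }
    }
  }
}
```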
Installing via Cline
To install graphlit-mcp-server in the Cline IDE, Cline should use NPX:
Your cline_mcp_settings.json file should be configured similar to:
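For example, using the same mcpServers structure with placeholder values:

```json
{
  "mcpServers": {
    "graphlit-mcp-server": {
      "command": "npx",
      "args": ["-y", "graphlit-mcp-server"],
      "env": {
        "GRAPHLIT_ORGANIZATION_ID": "your-organization-id",
        "GRAPHLIT_ENVIRONMENT_ID": "your-environment-id",
        "GRAPHLIT_JWT_SECRET": "your-jwt-secret"
      }
    }
  }
}
```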
Installing via Cursor
To install graphlit-mcp-server in the Cursor IDE, Cursor should use NPX:
Your mcp.json file should be configured similar to:
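For example, again assuming the standard mcpServers layout with placeholder values:

```json
{
  "mcpServers": {
    "graphlit-mcp-server": {
      "command": "npx",
      "args": ["-y", "graphlit-mcp-server"],
      "env": {
        "GRAPHLIT_ORGANIZATION_ID": "your-organization-id",
        "GRAPHLIT_ENVIRONMENT_ID": "your-environment-id",
        "GRAPHLIT_JWT_SECRET": "your-jwt-secret"
      }
    }
  }
}
```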
Installing via Smithery
To install graphlit-mcp-server for Claude Desktop automatically via Smithery:
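Smithery generates the client configuration for you; a typical invocation, assuming the server is published under the graphlit-mcp-server identifier and the Smithery CLI's standard install command, is `npx -y @smithery/cli install graphlit-mcp-server --client claude`.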
Installing manually
To use the Graphlit MCP Server in any MCP client application, use:
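A sketch of a typical client configuration, assuming an mcpServers-style layout (the key name varies by client) and placeholder credential values:

```json
{
  "mcpServers": {
    "graphlit-mcp-server": {
      "command": "npx",
      "args": ["-y", "graphlit-mcp-server"],
      "env": {
        "GRAPHLIT_ORGANIZATION_ID": "your-organization-id",
        "GRAPHLIT_ENVIRONMENT_ID": "your-environment-id",
        "GRAPHLIT_JWT_SECRET": "your-jwt-secret"
      }
    }
  }
}
```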
Optionally, you can configure credentials for data connectors such as Slack, Google Mail, and Notion. Only GRAPHLIT_ORGANIZATION_ID, GRAPHLIT_ENVIRONMENT_ID and GRAPHLIT_JWT_SECRET are required.
NOTE: When running npx on Windows, you may need to explicitly call npx via the Command Prompt.
Support
Please refer to the Graphlit API Documentation.
For support with the Graphlit MCP Server, please submit a GitHub Issue.
For further support with the Graphlit Platform, please join our Discord community.