- Backend service that periodically fetches the latest AI information from various sources
- Storage for the AI information data after it has been processed by LLMs
- Sources of AI information from channels such as Lex Fridman, Sequoia Capital, and other podcasts
Introduction
This open-source MCP program tracks premier global AI information, letting you easily access relevant AI updates in your daily LLM conversations without searching multiple channels like YouTube or X.
The project has two parts:
- The MCP server: cloud-hostable or locally runnable; it uses the MCP protocol to interact with LLMs.
- A FastAPI backend: it periodically fetches the latest AI information from the sources listed below; after light LLM processing, the data is stored in Supabase.
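As a rough sketch of how the MCP side could expose the stored data (not the project's actual code), the example below uses the official Python MCP SDK and the Supabase client, and assumes a table named `ai_info` with `title`, `summary`, `source`, and `published_at` columns; the table name, column names, and environment variables are illustrative assumptions.

```python
# Minimal sketch only -- table name, columns, and env vars are assumptions,
# not the project's actual schema.
import os

from mcp.server.fastmcp import FastMCP
from supabase import create_client

mcp = FastMCP("ai-surge-info")

# Assumed environment variables pointing at the Supabase project.
supabase = create_client(os.environ["SUPABASE_URL"], os.environ["SUPABASE_KEY"])


@mcp.tool()
def latest_ai_info(limit: int = 10) -> list[dict]:
    """Return the most recent LLM-processed AI updates stored in Supabase."""
    response = (
        supabase.table("ai_info")  # assumed table name
        .select("title, summary, source, published_at")
        .order("published_at", desc=True)
        .limit(limit)
        .execute()
    )
    return response.data


if __name__ == "__main__":
    mcp.run()  # serve over stdio so an MCP-capable client can call the tool
```

An MCP-capable client could then call `latest_ai_info` during a conversation instead of the user manually searching YouTube or X.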
AI Surge Info List
X (Twitter)
- Andrej Karpathy @karpathy
- Ethan Mollick @emollick
- Dan Shipper @danshipper
- gabriel @GabrielPeterss4
- Justine Moore @venturetwins
- Peter Yang @peteryang
- GREG ISENBERG @gregisenberg
- Lenny Rachitsky @lennysan
Podcasts & YouTube
- Acquired
- AI and I
- Behind the craft
- Founders
- Hallway chat
- Hard fork
- Sequoia Capital
- Lex Fridman
- Lightcone (YC)
- Greg Isenberg
- Lenny's Podcast
- Unsupervised Learning
Ongoing Updates
Contributions of additional high-quality AI information sources are welcome. We are still updating the list continuously, and the FastAPI backend program will also be updated in the future, so please stay tuned.