PT-Edge — AI Project Intelligence
PT-Edge is an MCP server that makes AI assistants less wrong about the current state of AI development. It tracks 300+ open-source AI projects across major labs, collecting real-time signals from GitHub, PyPI, npm, HuggingFace, Docker Hub, and Hacker News.
Built by — a newsletter covering the engineering side of AI.
What It Does
Daily ingests pull GitHub stats, package downloads, releases, HN posts, V2EX discussions, and newsletter coverage
Materialized views compute derived metrics: momentum, hype ratio, tiers, lifecycle stage
LLM-powered enrichment — Claude Haiku summarises releases and newsletter topics; OpenAI embeds everything for semantic search
30+ MCP tools let you query this data naturally in conversation
Community feedback system — corrections, article pitches, and lab event tracking
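As a sketch of the momentum computation described above — a delta in a cumulative metric over a trailing window — assuming a simple date-keyed history; the function and data here are illustrative, not PT-Edge's actual schema:

```python
from datetime import date, timedelta

def momentum(history: dict[date, int], today: date, window_days: int) -> int:
    """Change in a cumulative metric (stars, downloads) over a trailing window."""
    start = today - timedelta(days=window_days)
    return history[today] - history[start]

# Toy star-count snapshots for one project.
stars = {
    date(2025, 1, 1): 1000,
    date(2025, 1, 8): 1150,
    date(2025, 1, 31): 1600,
}
print(momentum(stars, date(2025, 1, 8), 7))    # 7-day delta: 150
print(momentum(stars, date(2025, 1, 31), 30))  # 30-day delta: 600
```

In the real pipeline this runs inside materialized views over daily snapshots rather than in application code, so the deltas are precomputed at query time.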
Available Tools
| Category | Tools |
| --- | --- |
| Discovery | |
| Deep Dives | |
| Comparison | |
| Project Discovery | |
| Community | |
| Lab Intelligence | |
| Methodology | |
| Power User | |
Key Concepts
Hype Ratio — stars / monthly downloads. High = GitHub tourism. Low = invisible infrastructure.
Tiers — T1 Foundational (>10M downloads), T2 Major (>100K), T3 Notable (>10K), T4 Emerging
Lifecycle — emerging → launching → growing → established → fading → dormant
Momentum — star and download deltas over 7-day and 30-day windows
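A minimal sketch of how the hype ratio and tier thresholds above fit together; the function names are illustrative, not PT-Edge's actual code:

```python
def hype_ratio(stars: int, monthly_downloads: int) -> float:
    """stars / monthly downloads. High = GitHub tourism; low = invisible infrastructure."""
    return stars / max(monthly_downloads, 1)

def tier(monthly_downloads: int) -> str:
    """Map download volume onto the tiers defined above."""
    if monthly_downloads > 10_000_000:
        return "T1 Foundational"
    if monthly_downloads > 100_000:
        return "T2 Major"
    if monthly_downloads > 10_000:
        return "T3 Notable"
    return "T4 Emerging"

# 50k stars but only 2k monthly downloads: hype, not adoption.
print(hype_ratio(50_000, 2_000))  # 25.0
print(tier(2_000))                # T4 Emerging
```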
Connecting
PT-Edge uses the MCP Streamable HTTP transport. Connect via:
https://mcp.phasetransitions.ai/mcp?token=YOUR_TOKEN

Works with Claude Desktop, Claude.ai (web connector), and any MCP-compatible client.
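For clients that speak stdio locally (such as Claude Desktop), one common bridge to a Streamable HTTP endpoint is the `mcp-remote` proxy. The snippet below is a sketch of that setup, not official PT-Edge configuration:

```json
{
  "mcpServers": {
    "pt-edge": {
      "command": "npx",
      "args": ["mcp-remote", "https://mcp.phasetransitions.ai/mcp?token=YOUR_TOKEN"]
    }
  }
}
```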
Stack
Runtime: Python 3.11, FastAPI, FastMCP
Database: PostgreSQL 16 with pgvector
Embeddings: OpenAI text-embedding-3-large (1536 dimensions)
LLM: Claude Haiku 4.5 (release + newsletter summarisation)
Hosting: Render (web service + cron + managed Postgres)
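To illustrate what the embedding layer in this stack does conceptually, here is a toy cosine-similarity ranking. In production the comparison happens inside PostgreSQL via pgvector's distance operators, and the vectors come from `text-embedding-3-large`; the 3-dimensional vectors and document names below are made up:

```python
import math

def cosine(a: list[float], b: list[float]) -> float:
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

# Toy 3-dim "embeddings"; the real ones are 1536-dim vectors.
docs = {
    "release summary A": [0.9, 0.1, 0.0],
    "newsletter topic B": [0.2, 0.8, 0.1],
}
query = [0.85, 0.15, 0.05]
best = max(docs, key=lambda name: cosine(query, docs[name]))
print(best)  # release summary A
```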
Development
# Clone and set up
git clone https://github.com/grahamrowe82/pt-edge.git
cd pt-edge
cp .env.example .env # Add your API keys
# Start database
docker compose up -d
# Run migrations
python -m app.migrations.run
# Start server
uvicorn app.main:app --reload
# Run daily ingest
python scripts/ingest_all.py

License
MIT — see LICENSE.