# LinkedIn Lead Automation MCP Server
Production-grade LinkedIn Lead Automation MCP (Model Context Protocol) Server with real-time search, analysis, scoring, messaging, and automated follow-up sequences.
## Features
- 🔍 **Lead Discovery**: Search LinkedIn profiles by keywords, location, and filters
- 📊 **Profile Analysis**: Extract and analyze complete LinkedIn profile data
- 🎯 **AI-Powered Scoring**: Intelligent lead scoring (0-100) based on profile data
- 💬 **Message Generation**: Hyper-personalized message generation using AI
- 📨 **Automated Messaging**: Send connection requests and direct messages
- 🔄 **Follow-up Sequences**: Automated multi-stage follow-up campaigns
- 🔐 **API Key Management**: Secure tier-based access control
- 📈 **Usage Tracking**: Monitor API usage and enforce tier limits
- 🗄️ **PostgreSQL Support**: Built with Neon PostgreSQL for production use
## Architecture
- **MCP Server** (`src/index.js`): Stdio-based MCP protocol server
- **HTTP API** (`src/http-server.js`): RESTful HTTP API wrapper
- **Background Worker** (`src/worker.js`): Automated follow-up sequence processor
- **Database** (`src/database-pg.js`): PostgreSQL database layer
- **LinkedIn Automation** (`src/linkedin.js`): Chrome DevTools Protocol integration
- **AI Service** (`src/ai.js`): Anthropic Claude on Vertex AI (Google Cloud) for lead scoring and message generation
## Prerequisites
- Node.js 18+
- PostgreSQL (Neon or any PostgreSQL 14+)
- Chrome/Chromium browser with remote debugging enabled
- Google Cloud SDK with gcloud CLI (for Vertex AI authentication)
- GCP Project with Vertex AI API enabled
## Installation
```bash
# Clone the repository
git clone https://github.com/vikram-agentic/linkedin-mcp.git
cd linkedin-mcp
# Install dependencies
npm install
# Create .env file
cp .env.example .env
```
## Configuration
Create a `.env` file with the following variables:
```env
# Database (Neon PostgreSQL)
DATABASE_URL=postgresql://user:password@host/database?sslmode=require
# Google Cloud / Vertex AI Configuration
GCP_PROJECT_ID=amgn-app
GCP_LOCATION=global
ANTHROPIC_MODEL_ID=claude-sonnet-4-5
# Server Configuration
PORT=3001
# Chrome DevTools Protocol (optional, for browser automation)
CDP_URL=http://localhost:9222
```
## Database Setup
1. Create a Neon PostgreSQL database (or use any PostgreSQL 14+)
2. Apply the schema in the Neon SQL Editor:
```bash
# Print the schema so it can be copied into the Neon SQL Editor
cat database/schema-neon.sql
```
Copy the output (the contents of `database/schema-neon.sql`) into the Neon SQL Editor and execute it.
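Alternatively, if you have `psql` installed, the same schema can be applied directly against the connection string (assuming `DATABASE_URL` is exported in your shell):
```bash
# Apply the schema with psql instead of the Neon SQL Editor
psql "$DATABASE_URL" -f database/schema-neon.sql
```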
## Usage
### Start MCP Server (Stdio)
```bash
npm start
```
This starts the MCP server using stdio transport. Connect via MCP clients like Claude Desktop.
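For reference, a client entry for this server might look like the sketch below; the server name and path are placeholders, and you can add an `env` block if the client should supply environment variables instead of `.env`:
```json
{
  "mcpServers": {
    "linkedin-lead-automation": {
      "command": "node",
      "args": ["/absolute/path/to/linkedin-mcp/src/index.js"]
    }
  }
}
```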
### Start HTTP API Server
```bash
npm run http
```
This starts the HTTP API server on port 3001 (or PORT from .env).
### Start Background Worker
```bash
npm run worker
```
This starts the automated follow-up sequence processor.
## API Endpoints
### Health Check
```
GET /health
```
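A quick smoke test with `curl`, assuming the HTTP API is running locally on the default port:
```bash
curl http://localhost:3001/health
```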
### Generate API Key
```
POST /api/generate-key
Body: { "tier": "starter" | "professional" | "agency" | "enterprise" }
```
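For example, creating a starter-tier key against a local instance (the exact response shape depends on the server):
```bash
curl -X POST http://localhost:3001/api/generate-key \
  -H "Content-Type: application/json" \
  -d '{"tier": "starter"}'
```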
### Connect Browser
```
POST /api/browser/connect
Body: { "cdp_url": "http://localhost:9222" }
```
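`cdp_url` should point at a Chrome instance with remote debugging enabled. A typical local setup looks like the sketch below (the binary name varies by platform, and newer Chrome builds may also require a dedicated `--user-data-dir`):
```bash
# Expose the Chrome DevTools Protocol on port 9222
google-chrome --remote-debugging-port=9222 &

# Tell the server where to find it (local API assumed)
curl -X POST http://localhost:3001/api/browser/connect \
  -H "Content-Type: application/json" \
  -d '{"cdp_url": "http://localhost:9222"}'
```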
### Setup LinkedIn Session
```
POST /api/session/setup
Body: { "api_key": "...", "li_at_cookie": "..." }
```
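Example request (local server assumed); the `li_at` value is the session cookie copied from a logged-in LinkedIn browser session:
```bash
curl -X POST http://localhost:3001/api/session/setup \
  -H "Content-Type: application/json" \
  -d '{"api_key": "YOUR_API_KEY", "li_at_cookie": "YOUR_LI_AT_COOKIE"}'
```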
### Search Leads
```
POST /api/leads/search
Body: { "api_key": "...", "keywords": "...", "location": "...", "limit": 25 }
```
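Example request (local server assumed; keyword and location values are placeholders). The remaining POST endpoints below follow the same pattern:
```bash
curl -X POST http://localhost:3001/api/leads/search \
  -H "Content-Type: application/json" \
  -d '{"api_key": "YOUR_API_KEY", "keywords": "CTO fintech", "location": "London", "limit": 25}'
```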
### Analyze Profile
```
POST /api/leads/analyze
Body: { "api_key": "...", "profile_url": "..." }
```
### Score Lead
```
POST /api/leads/score
Body: { "api_key": "...", "profile_url": "..." }
```
### Generate Message
```
POST /api/messages/generate
Body: {
  "api_key": "...",
  "profile_url": "...",
  "value_proposition": "...",
  "message_type": "connection" | "direct"
}
```
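Example request for a connection-style message (local server assumed; the profile URL and value proposition are placeholders):
```bash
curl -X POST http://localhost:3001/api/messages/generate \
  -H "Content-Type: application/json" \
  -d '{
    "api_key": "YOUR_API_KEY",
    "profile_url": "https://www.linkedin.com/in/example-profile/",
    "value_proposition": "We help B2B SaaS teams automate outbound",
    "message_type": "connection"
  }'
```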
### Send Message
```
POST /api/messages/send
Body: {
  "api_key": "...",
  "profile_url": "...",
  "message": "...",
  "is_connection_request": false
}
```
### Create Follow-up Sequence
```
POST /api/sequences/create
Body: {
  "api_key": "...",
  "profile_url": "...",
  "initial_message": "...",
  "num_followups": 3
}
```
### Get Leads
```
GET /api/leads?api_key=...
```
### Get Usage Stats
```
GET /api/usage?api_key=...
```
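Both GET endpoints take the API key as a query parameter, e.g. against a local instance:
```bash
curl "http://localhost:3001/api/leads?api_key=YOUR_API_KEY"
curl "http://localhost:3001/api/usage?api_key=YOUR_API_KEY"
```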
## MCP Tools
When run as an MCP server (stdio transport), the following tools are available:
- `connect_browser`: Connect to Chrome via CDP
- `setup_session`: Authenticate LinkedIn session
- `search_leads`: Search for LinkedIn leads
- `analyze_profile`: Extract profile data
- `score_lead`: AI-powered lead scoring
- `generate_message`: Generate personalized messages
- `send_message`: Send messages to profiles
- `create_followup_sequence`: Create automated sequences
- `generate_api_key`: Generate API keys
## Tier Limits
| Tier | Profiles | Messages | Sequences |
|------|----------|----------|-----------|
| Starter | 500/month | 200/month | 2 active |
| Professional | 2,000/month | 1,000/month | 10 active |
| Agency | 10,000/month | 5,000/month | Unlimited |
| Enterprise | Unlimited | Unlimited | Unlimited |
## Development
```bash
# Generate a test API key
npm run generate-key
# Run in development mode
npm start
```
## Production Deployment
### Deploy to Vercel
1. **Connect Repository to Vercel:**
```bash
# Install Vercel CLI
npm i -g vercel
# Login and deploy
vercel login
vercel --prod
```
2. **Set Environment Variables in Vercel Dashboard:**
- `DATABASE_URL`: Your Neon PostgreSQL connection string
- `GCP_PROJECT_ID`: Your Google Cloud project ID
- `GCP_LOCATION`: Location (default: `global`)
- `ANTHROPIC_MODEL_ID`: Model ID (default: `claude-sonnet-4-5`)
3. **Configure GCP Authentication:**
Since Vercel can't run `gcloud auth`, you have two options:
**Option A: Use a Service Account (Recommended)**
- Create a GCP service account with Vertex AI permissions
- Download its JSON key file
- Base64-encode the key and set it as the `GOOGLE_APPLICATION_CREDENTIALS` environment variable (see the sketch below)
- Update `src/ai.js` to use service account auth
**Option B: Use an API Key (Alternative)**
- Generate a Vertex AI API key
- Set it as the `VERTEX_AI_API_KEY` environment variable
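For Option A, one way to produce the single-line base64 value (whether `src/ai.js` decodes a base64 string or expects a file path is an assumption to confirm against your auth code):
```bash
# Produce a single-line base64 string from the service-account key;
# tr strips the line wrapping that GNU base64 adds by default
base64 < service-account.json | tr -d '\n'
```
Paste the output into the Vercel dashboard (or via `vercel env add`) as `GOOGLE_APPLICATION_CREDENTIALS`.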
### Deploy with PM2 (Self-Hosted)
1. Set up PostgreSQL database (recommended: Neon)
2. Configure environment variables
3. Run database schema
4. Deploy using PM2:
```bash
pm2 start src/http-server.js --name linkedin-mcp-api
pm2 start src/worker.js --name linkedin-mcp-worker
```
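To keep both processes running across reboots, the standard PM2 persistence commands apply:
```bash
# Save the current process list and generate a boot-time startup script
pm2 save
pm2 startup
```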
## Security Notes
- ⚠️ **Never commit `.env` files** - they contain sensitive credentials
- 🔐 API keys are hashed using bcrypt
- 🔒 All database queries use parameterized statements
- 🛡️ CORS is configured for production use
## License
MIT License - see LICENSE file for details
## Author
Agentic AI AMRO Ltd
## Support
For issues and feature requests, please open an issue on GitHub.