
LinkedIn MCP Server

by udaykakade25


📖 Overview

LinkedIn MCP Server is a Model Context Protocol (MCP) implementation that bridges language models and other applications with LinkedIn's API. It provides a standardized interface for executing LinkedIn operations through various tools defined by the MCP standard.

🚀 Features

This server provides the following capabilities through MCP tools:

| Tool | Description |
| --- | --- |
| `get_profile_info` | Retrieve LinkedIn profile information (current user or a specified person) |
| `create_text_post` | Create a text post on LinkedIn with customizable visibility |
| `create_article_post` | Create an article post with a title and content |
| `get_user_posts` | Retrieve recent posts from a user's profile |
| `get_network_updates` | Get network updates from the LinkedIn feed |
| `search_people` | Search for people on LinkedIn |
| `get_company_info` | Retrieve information about a LinkedIn company |

🔧 Prerequisites

You'll need one of the following:

  • Docker: Docker installed and running (recommended)
  • Python: Python 3.12+ with pip

⚙️ Setup & Configuration

LinkedIn App Setup

  1. Create a LinkedIn App:
    • Visit the LinkedIn Developer Portal
    • Create a new application and add it to your developer account
    • Under the "Auth" section, configure the following scopes:
      • r_liteprofile (for basic profile access)
      • w_member_social (for posting content)
    • Copy your Client ID and Client Secret
  2. Generate Access Token:
    • Use LinkedIn's OAuth2 authorization code flow
    • Navigate to OAuth2 > URL Generator in the LinkedIn Developer Portal
    • Generate an access token with the required scopes
    • For testing, you can use the temporary access token provided in the developer console
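The authorization-code exchange in step 2 can be sketched with the standard library. The token endpoint below is LinkedIn's documented OAuth2 endpoint; the function names and parameter values are illustrative, not part of this server's code:

```python
import json
import urllib.parse
import urllib.request

TOKEN_URL = "https://www.linkedin.com/oauth/v2/accessToken"

def build_token_request(code, client_id, client_secret, redirect_uri):
    # LinkedIn expects these fields as application/x-www-form-urlencoded.
    return {
        "grant_type": "authorization_code",
        "code": code,
        "client_id": client_id,
        "client_secret": client_secret,
        "redirect_uri": redirect_uri,
    }

def exchange_code(code, client_id, client_secret, redirect_uri):
    # POST the form data and pull the access token out of the JSON response.
    data = urllib.parse.urlencode(
        build_token_request(code, client_id, client_secret, redirect_uri)
    ).encode()
    with urllib.request.urlopen(urllib.request.Request(TOKEN_URL, data=data)) as resp:
        return json.load(resp)["access_token"]
```

The returned token is what goes into `LINKEDIN_ACCESS_TOKEN` below.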

Environment Configuration

  3. Create your environment file:

     ```bash
     cp .env.example .env
     ```

  4. Edit the `.env` file with your LinkedIn credentials:

     ```
     LINKEDIN_ACCESS_TOKEN=YOUR_ACTUAL_LINKEDIN_ACCESS_TOKEN
     LINKEDIN_MCP_SERVER_PORT=5000
     ```
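At startup the server would typically read these variables from the environment; a minimal sketch (the `load_settings` helper is illustrative, not the server's actual code):

```python
import os

def load_settings():
    # Fail fast if the access token is missing; default the port to 5000.
    token = os.getenv("LINKEDIN_ACCESS_TOKEN")
    if not token:
        raise RuntimeError("LINKEDIN_ACCESS_TOKEN is not set")
    port = int(os.getenv("LINKEDIN_MCP_SERVER_PORT", "5000"))
    return token, port
```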

🏃‍♂️ Running the Server

Option 1: Docker (Recommended)

The Docker build must be run from the project root directory (klavis/):

```bash
# Navigate to the root directory of the project
cd /path/to/klavis

# Build the Docker image
docker build -t linkedin-mcp-server -f mcp_servers/linkedin/Dockerfile .

# Run the container
docker run -d -p 5000:5000 --name linkedin-mcp linkedin-mcp-server
```

To use your local `.env` file instead of building it into the image:

```bash
docker run -d -p 5000:5000 --env-file mcp_servers/linkedin/.env --name linkedin-mcp linkedin-mcp-server
```

Option 2: Python Virtual Environment

```bash
# Navigate to the LinkedIn server directory
cd mcp_servers/linkedin

# Create and activate virtual environment
python -m venv venv
source venv/bin/activate  # On Windows: venv\Scripts\activate

# Install dependencies
pip install -r requirements.txt

# Run the server
python server.py
```

Once running, the server will be accessible at http://localhost:5000.

🔌 API Usage

The server implements the Model Context Protocol (MCP) standard. Here's an example of how to call a tool:

```python
import httpx

async def call_linkedin_tool():
    url = "http://localhost:5000/mcp"
    payload = {
        "tool_name": "linkedin_create_text_post",
        "tool_args": {
            "text": "Hello from LinkedIn MCP Server!",
            "visibility": "PUBLIC"
        }
    }
    async with httpx.AsyncClient() as client:
        response = await client.post(url, json=payload)
        result = response.json()
    return result
```

📋 Common Operations

Getting Profile Information

```python
payload = {
    "tool_name": "linkedin_get_profile_info",
    "tool_args": {}  # Empty for current user, or provide person_id
}
```

Creating a Text Post

```python
payload = {
    "tool_name": "linkedin_create_text_post",
    "tool_args": {
        "text": "Excited to share my latest project!",
        "visibility": "PUBLIC"
    }
}
```

Creating an Article Post

```python
payload = {
    "tool_name": "linkedin_create_article_post",
    "tool_args": {
        "title": "The Future of AI",
        "text": "In this article, I explore the latest trends in artificial intelligence...",
        "visibility": "PUBLIC"
    }
}
```

Searching for People

```python
payload = {
    "tool_name": "linkedin_search_people",
    "tool_args": {
        "keywords": "software engineer",
        "count": 10
    }
}
```
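Each of these payloads is sent as a JSON POST to the server's `/mcp` endpoint. For scripts that don't use httpx, the same call can be made with the standard library; a sketch (the helper names are illustrative):

```python
import json
import urllib.request

def build_tool_request(payload, base_url="http://localhost:5000"):
    # Encode the MCP payload as a JSON POST to the /mcp endpoint.
    return urllib.request.Request(
        f"{base_url}/mcp",
        data=json.dumps(payload).encode(),
        headers={"Content-Type": "application/json"},
        method="POST",
    )

def call_tool(payload, base_url="http://localhost:5000"):
    # Send the request and decode the server's JSON response.
    with urllib.request.urlopen(build_tool_request(payload, base_url)) as resp:
        return json.load(resp)
```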

🛠️ Troubleshooting

Docker Build Issues

  • File Not Found Errors: If you see errors like `failed to compute cache key: failed to calculate checksum of ref: not found`, Docker can't find the files referenced in the Dockerfile. Make sure you're building from the root project directory (`klavis/`), not from the server directory.

Common Runtime Issues

  • Authentication Failures: Verify your access token is correct and hasn't expired. LinkedIn access tokens are valid for a limited period (typically 60 days), so they need to be regenerated or refreshed periodically.
  • API Errors: Check LinkedIn API documentation for error meanings and status codes.
  • Missing Permissions: Ensure your LinkedIn app has the necessary scopes enabled (r_liteprofile, w_member_social).
  • Rate Limiting: LinkedIn has strict rate limits. Implement appropriate delays between requests if needed.
  • Scope Issues: Some endpoints require additional permissions or LinkedIn partnership status.
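For the rate-limiting point above, a simple exponential backoff between retries is usually enough; a sketch (not part of the server itself, the helper names are illustrative):

```python
import time

def backoff_delays(retries=5, base=1.0, cap=30.0):
    # Exponentially growing delays: 1, 2, 4, ... seconds, capped at `cap`.
    for attempt in range(retries):
        yield min(cap, base * (2 ** attempt))

def call_with_backoff(fn, is_rate_limited, sleep=time.sleep):
    # Retry fn() with growing delays while responses look rate-limited
    # (e.g. an HTTP 429 status); return the last result either way.
    result = fn()
    for delay in backoff_delays():
        if not is_rate_limited(result):
            return result
        sleep(delay)
        result = fn()
    return result
```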

🤝 Contributing

Contributions are welcome! Please feel free to submit a Pull Request.

  1. Fork the repository
  2. Create your feature branch (git checkout -b feature/amazing-feature)
  3. Commit your changes (git commit -m 'Add some amazing feature')
  4. Push to the branch (git push origin feature/amazing-feature)
  5. Open a Pull Request

📜 License

This project is licensed under the MIT License - see the LICENSE file for details.
