

LinkedIn MCP Server with Anthropic Integration

A Python-based MCP (Model Context Protocol) server that serves data from your LinkedIn profile and integrates with the Anthropic API for analysis tasks. The project follows the src layout for Python packaging.

TL;DR: Install for Claude Desktop access to the LinkedIn profile

# 1.a) Install the MCP server access in Claude Desktop
./install_claude_desktop_mcp.sh

# 1.b) ... or manually add this JSON snippet to the `mcpServers` section of your `claude_desktop_config.json`
#      (e.g. `~/Library/Application\ Support/Claude/claude_desktop_config.json`)
{
  "linkedin_francisco_perez_sorrosal": {
    "command": "npx",
    "args": ["mcp-remote", "http://localhost:10000/mcp"]
  }
}

# 2) Restart Claude and check that the 'Add from linkedin_francisco_perez_sorrosal' option is available in the MCP servers list

# 3) Query the LinkedIn profile served from the MCP server in Claude Desktop! e.g. TODO

Features

  • Serves your LinkedIn profile from the project root
  • Built with FastAPI for high performance and with Pixi for dependency management and task running
  • Source code organized in the src/ directory
  • Includes configurations for:
    • Docker (optional, for containerization)
    • Linting (Ruff, Black, isort)
    • Formatting
    • Type checking (MyPy)

Prerequisites

  • Python 3.11+
  • Pixi (for dependency management and task execution)
  • Docker (optional, for containerization)
  • Access to your LinkedIn profile

Project Structure

.
├── .dockerignore
├── .gitignore
├── Dockerfile
├── pyproject.toml            # Python project metadata and dependencies (PEP 621)
├── README.md
├── src/
│   └── linkedin_mcp_server/
│       ├── __init__.py
│       └── main.py           # FastAPI application logic
├── tests/                    # Test files (e.g., tests_main.py)
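main.py in the tree above is the server entrypoint. As a rough orientation, a minimal version built on the MCP Python SDK (FastMCP) could look like the sketch below; the tool name, profile file name, host, and port are illustrative assumptions, not the project's actual code.

# Minimal sketch of src/linkedin_mcp_server/main.py (assumed, not the real implementation)
from pathlib import Path

from mcp.server.fastmcp import FastMCP

# host/port are assumptions; they are used when running with the streamable-http transport
mcp = FastMCP("linkedin_francisco_perez_sorrosal", host="0.0.0.0", port=10000)

# Hypothetical location of the profile file served from the project root
PROFILE_PATH = Path(__file__).resolve().parents[2] / "linkedin_profile.md"


@mcp.tool()
def get_linkedin_profile() -> str:
    """Return the LinkedIn profile stored in the project root."""
    return PROFILE_PATH.read_text(encoding="utf-8")


if __name__ == "__main__":
    # Transport can be "stdio", "sse", or "streamable-http", mirroring the run commands below
    mcp.run(transport="streamable-http")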

Setup and Installation

  1. Clone the repository (if applicable) or ensure you are in the project root directory.
  2. Install dependencies using Pixi:

This command will create a virtual environment and install all necessary dependencies:

pixi install

Running the Server

Pixi tasks are defined in pyproject.toml:
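For orientation, the task table probably looks something like the following; the exact commands are assumptions, so check pyproject.toml for the real definitions.

# Hypothetical [tool.pixi.tasks] excerpt from pyproject.toml
[tool.pixi.tasks]
mcps = "python src/linkedin_mcp_server/main.py"
test = "pytest"
lint = "ruff check ."
format = "ruff format ."
build = "python -m build"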

mcps (MCP Server)

pixi run mcps --transport stdio

Development Mode (with auto-reload)

# Using pixi directly
pixi run mcps --transport stdio  # or sse, streamable-http

# Alternatively, using uv directly
uv run --with "mcp[cli]" mcp run src/linkedin_mcp_server/main.py --transport streamable-http
# Go to http://127.0.0.1:10000/mcp

The server will start at http://localhost:10000. It will automatically reload if you make changes to files in the src/ directory.
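To check that the streamable-http endpoint is answering, you can send an MCP initialize request by hand. A sketch (the client name and protocol version string are just example values):

curl -s -X POST http://localhost:10000/mcp \
  -H 'Content-Type: application/json' \
  -H 'Accept: application/json, text/event-stream' \
  -d '{"jsonrpc":"2.0","id":1,"method":"initialize","params":{"protocolVersion":"2025-03-26","capabilities":{},"clientInfo":{"name":"curl","version":"0.0.0"}}}'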

MCP Inspection Mode

# Using pixi
DANGEROUSLY_OMIT_AUTH=true npx @modelcontextprotocol/inspector pixi run mcps --transport stdio

# Direct execution
DANGEROUSLY_OMIT_AUTH=true npx @modelcontextprotocol/inspector pixi run python src/linkedin_mcp_server/main.py --transport streamable-http

This starts the inspector for the MCP Server.

Web scraper

pixi run python src/linkedin_mcp_server/web_scrapper.py

Development Tasks

Run Tests

pixi run test

Lint and Check Formatting

pixi run lint

Apply Formatting and Fix Lint Issues

pixi run format

Build the Package

Creates sdist and wheel in dist/:

pixi run build

Docker Support (Optional)

Build the Docker Image

docker build -t linkedin-mcp-server .

Run the Docker Container

TODO: Docker support is not finished yet; this section will be rewritten once it is.
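Once the image builds and the Docker setup is complete, running the container would presumably just map the MCP port, along the lines of:

# Hypothetical invocation; untested until Docker support is finished
docker run --rm -p 10000:10000 linkedin-mcp-server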

MCP Server Configuration

Local Configuration for Claude Desktop

{ "linkedin_francisco_perez_sorrosal": { "command": "uv", "args": [ "run", "--with", "mcp[cli]", "--with", "pymupdf4llm", "mcp", "run", "src/linkedin_mcp_server/main.py", "--transport", "streamable-http" ] } }

Remote Configuration for Claude Desktop

For connecting to a remote MCP server:

{ "linkedin_francisco_perez_sorrosal": { "command": "npx", "args": ["mcp-remote", "http://localhost:10000/mcp"] } }

Note: Update the host and port as needed for your deployment.

Currently I'm using render.com to host the MCP server. The configuration is in the config/claude.json file.
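In practice, the remote entry is the same snippet as above with the deployed URL substituted in. A sketch, using a placeholder Render hostname:

{
  "linkedin_francisco_perez_sorrosal": {
    "command": "npx",
    "args": ["mcp-remote", "https://<your-render-service>.onrender.com/mcp"]
  }
}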

Render requires requirements.txt to be present in the root directory. You can generate it using:

uv pip compile pyproject.toml > requirements.txt

Render also requires a runtime.txt file in the root directory specifying the Python version:

python-3.11.11

Then you can query the linkedin_mcp_fps MCP server from Claude Desktop to get profile information:

TODO

License

This project is licensed under the MIT License. See pyproject.toml and the LICENSE file for details.


