MCP Create Server

by jobswithgpt

A dynamic MCP server management service that creates, runs, and manages Model Context Protocol (MCP) servers on the fly. The service itself functions as an MCP server and launches and manages other MCP servers as child processes, enabling a flexible MCP ecosystem.

Key Features

...

Configuring Claude Desktop as an MCP Client

  1. If you have a PRO account, you can add the hosted MCP connector directly (https://jobswithgpt.com/mcp).

  2. If you have a free account, you can add a local proxy to the hosted MCP server.

1. Prerequisites

  • Claude Desktop (free version) installed

  • Node.js ≥ 18 (for npx)

2. Create or Edit Claude Config

Locate (or create) the Claude Desktop config file:

  • macOS: ~/Library/Application Support/Claude/claude_desktop_config.json

  • Windows: %APPDATA%\Claude\claude_desktop_config.json
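The two locations above can also be resolved programmatically. A minimal sketch, assuming a `claude_config_path` helper that is illustrative and not part of this repo (only macOS and Windows paths are documented here):

```python
import os
import platform
from pathlib import Path

def claude_config_path() -> Path:
    """Return the Claude Desktop config file location for the current OS
    (only macOS and Windows locations are documented above)."""
    if platform.system() == "Windows":
        return Path(os.environ["APPDATA"]) / "Claude" / "claude_desktop_config.json"
    # Fall back to the macOS location otherwise.
    return (Path.home() / "Library" / "Application Support" / "Claude"
            / "claude_desktop_config.json")
```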


3. Add Local Proxy Definition

Insert this JSON:

```json
{
  "mcpServers": {
    "jobswithgpt": {
      "command": "npx",
      "args": ["-y", "mcp-remote@latest", "https://jobswithgpt.com/mcp"]
    }
  }
}
```

4. Restart Claude Desktop

Quit Claude Desktop completely and reopen it.

The new jobswithgpt server should appear in the paperclip menu under Tools.

OpenAI instructions

OpenAI agents can use the hosted MCP server directly (https://jobswithgpt.com/mcp):

```python
import asyncio

from agents import Agent, Runner
from agents.mcp.server import MCPServerStreamableHttp

MCP_URL = "https://jobswithgpt.com/mcp"  # the FastMCP streamable HTTP endpoint

async def main():
    async with MCPServerStreamableHttp(params={"url": MCP_URL}, name="jobswithgpt") as server:
        agent = Agent(
            name="jobs-mcp-local",
            mcp_servers=[server],
            instructions=(
                "Use the MCP server tools. First call location_autocomplete to get a geonameid "
                "for 'Seattle', then call search_jobs with keywords=['python'] and that geonameid."
            ),
        )
        res = await Runner.run(agent, "Find machine learning jobs in san francisco.")
        print(res.final_output)

if __name__ == "__main__":
    asyncio.run(main())
```

Example output

Here are some Python developer job opportunities in San Francisco:

1. Software Engineer - Backend, Product Engineering at Baton ([Apply here](https://job-boards.greenhouse.io/baton/jobs/4011483007))
2. Senior Backend Engineer at Stellic ([Apply here](https://job-boards.greenhouse.io/stellic/jobs/4705805007))
3. Software Engineer - Backend at Julius AI ([Apply here](https://jobs.ashbyhq.com/julius/75f8ef44-4fa4-46fa-b416-c7b697078eca))

etc.
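The agent instructions above encode a two-step tool sequence: resolve a city to a geonameid, then search with it. A stubbed sketch of that flow, where both tool bodies are hypothetical stand-ins for the hosted server's tools (the geonameids are sample values, not guaranteed by the service):

```python
# Hypothetical stand-ins for the hosted MCP tools, for illustration only.
def location_autocomplete(query: str) -> list[dict]:
    """Return candidate locations matching the query, each with a geonameid."""
    cities = {"seattle": 5809844, "san francisco": 5391959}  # sample geonameids
    return [{"name": name.title(), "geonameid": gid}
            for name, gid in cities.items() if query.lower() in name]

def search_jobs(keywords: list[str], geonameid: int) -> list[str]:
    """Return job listings for the keywords near the given geonameid."""
    return [f"{kw} job near geonameid {geonameid}" for kw in keywords]

def find_jobs(city: str, keywords: list[str]) -> list[str]:
    # Step 1: resolve the city name to a geonameid.
    matches = location_autocomplete(city)
    if not matches:
        return []
    # Step 2: search using the first resolved id.
    return search_jobs(keywords, matches[0]["geonameid"])
```

The agent performs the same chaining itself, choosing tool arguments from the conversation instead of from hard-coded dictionaries.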


Local Job Search CLI

This repository also ships a small uv-managed Python CLI that chats with the hosted MCP server to search for jobs directly from your terminal.

Requirements

  • Python 3.11+

  • uv

Install dependencies

```shell
uv sync  # creates a virtual environment in .venv and installs httpx
```

Run the chat interface

```shell
uv run jobs-mcp
```

The CLI:

  • opens an MCP session against https://jobswithgpt.com/mcp

  • talks to the location_autocomplete and search_jobs tools

  • remembers your previous session so you can issue multiple job searches in one run

While running you can type:

  • natural language queries such as `python jobs in Seattle`

  • `schema` to print the MCP tool descriptions

  • `quit` / `exit` to leave

Use `uv run jobs-mcp --help` to see optional flags (custom endpoint, number of location suggestions, etc.).

