Berghain Events MCP Server

by alejofig

Integrations

  • Implements a FastAPI server that exposes Berghain events data as API endpoints and MCP tools for AI agents

  • Uses PydanticAI for agent implementation when testing the MCP server capabilities

  • Built on Python for server implementation, data processing, and API functionality

Berghain Events API & MCP Implementation

Project structure

berghain-api/
├── app/                  # Main FastAPI application
│   ├── api/              # API routes
│   ├── core/             # Configuration and core components
│   ├── db/               # Data access layer (DynamoDB)
│   └── main.py           # Application entry point and FastMCP server
├── scripts/              # Scripts for creating the table and loading data into DynamoDB
│   ├── create_table.py
│   └── load_data.py
├── events/               # Directory for the JSON files extracted with Firecrawl
├── Dockerfile            # Dockerfile for deployment
├── requirements.txt      # Project dependencies (for uv)
└── README.md             # This file

Requirements

  • Python 3.10+
  • uv (for dependency management and virtual environment)
  • AWS Account (for DynamoDB and App Runner)
  • Firecrawl API Key

Installation

  1. Clone this repository:
    git clone <repository-url>
    cd berghain-api
  2. Create and activate a virtual environment with uv :
    uv venv
    source .venv/bin/activate  # On Linux/macOS
    # .venv\Scripts\activate   # On Windows
  3. Install dependencies with uv :
    uv pip install -r requirements.txt

Detailed Process

1. Data Extraction with Firecrawl

  • Configure the Firecrawl MCP in Cursor: Make sure you have your Firecrawl API Key. In the Cursor MCP settings, add:
    "firecrawl-mcp": {
      "command": "npx",
      "args": ["-y", "firecrawl-mcp"],
      "env": {
        "FIRECRAWL_API_KEY": "fc-YOUR_FIRECRAWL_API_KEY"
      }
    }
  • Run the extraction: Use an agent in Cursor (or a similar tool) to call the Firecrawl MCP and ask it to extract events from https://www.berghain.berlin/en/program/ .
  • Save the data: The extracted data should be saved as JSON files in the events/ directory. For example, events/berghain_events_YYYY-MM-DD.json .
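The exact fields extracted depend on the prompt you give the agent; as an illustration, a small Python sketch of saving events in the expected location (the event schema shown here is an assumption, not the project's actual format):

```python
import json
from pathlib import Path

# Hypothetical shape of one extracted event; the real fields depend on
# what you ask Firecrawl to extract and are an assumption here.
sample_events = [
    {
        "date": "2024-06-01",
        "venue": "Berghain",
        "title": "Klubnacht",
        "artists": ["DJ A", "DJ B"],
        "url": "https://www.berghain.berlin/en/program/",
    }
]

def save_events(events: list[dict], directory: str = "events") -> Path:
    """Write extracted events to <directory>/berghain_events_<date>.json."""
    out_dir = Path(directory)
    out_dir.mkdir(exist_ok=True)
    out_path = out_dir / f"berghain_events_{events[0]['date']}.json"
    out_path.write_text(json.dumps(events, indent=2, ensure_ascii=False))
    return out_path

path = save_events(sample_events)
print(path.name)  # berghain_events_2024-06-01.json
```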

2. Loading Data into DynamoDB

  • Create the table in DynamoDB: The scripts/create_table.py script takes care of this. Run it (adjust the parameters if necessary):
    uv run python scripts/create_table.py --table berghain_events --region your-aws-region
    # For local development with DynamoDB Local (e.g. docker run -p 8000:8000 amazon/dynamodb-local):
    # uv run python scripts/create_table.py --table berghain_events --region localhost --endpoint-url http://localhost:8000
  • Load data into the table: The scripts/load_data.py script loads events from JSON files.
    uv run python scripts/load_data.py --table berghain_events --region your-aws-region --path events
    # For local development:
    # uv run python scripts/load_data.py --table berghain_events --region localhost --endpoint-url http://localhost:8000 --path events
    Make sure that app/core/config.py (imported by load_data.py ) has the necessary configurations if you don't pass them as arguments.
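The loading step boils down to turning each event dict into DynamoDB's attribute-value format and batching the writes. A minimal sketch of that serialization, assuming date/title keys (the project's real schema may differ; the actual script would pass this payload to boto3's batch_write_item):

```python
# Sketch of the serialization a loader script might perform before calling
# DynamoDB's batch_write_item. The key names (date, title) are assumptions.
def to_dynamodb_item(event: dict) -> dict:
    """Convert a plain event dict into DynamoDB attribute-value format."""
    item = {
        "date": {"S": event["date"]},
        "title": {"S": event["title"]},
    }
    if event.get("artists"):
        item["artists"] = {"L": [{"S": a} for a in event["artists"]]}
    return item

def to_put_requests(events: list[dict], table: str) -> dict:
    """Build the RequestItems payload for a batch_write_item call."""
    return {table: [{"PutRequest": {"Item": to_dynamodb_item(e)}} for e in events]}

payload = to_put_requests(
    [{"date": "2024-06-01", "title": "Klubnacht", "artists": ["DJ A"]}],
    table="berghain_events",
)
```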

3. API with FastAPI and MCP Server

  • The API logic resides in the app/ directory, with the endpoints defined (e.g., in app/api/endpoints/events.py ).
  • The app/main.py file is configured to launch the FastAPI application and the FastMCP server, exposing the API endpoints as tools for AI models. Review the custom_maps in app/main.py to see how GET routes are mapped to RouteType.TOOL .
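As an illustration of that wiring, a hedged sketch of what app/main.py might look like; the fastmcp import paths and RouteMap signature shown here match older FastMCP releases that expose RouteType (as the doc's custom_maps suggests) and are assumptions, not the project's actual code:

```python
# Sketch of app/main.py, assuming a FastMCP release with RouteMap/RouteType.
from fastapi import FastAPI
from fastmcp import FastMCP
from fastmcp.server.openapi import RouteMap, RouteType

app = FastAPI(title="Berghain Events API")

@app.get("/events")
async def list_events():
    ...  # query DynamoDB and return events

# Expose every GET route as an MCP tool instead of a resource.
custom_maps = [RouteMap(methods=["GET"], route_type=RouteType.TOOL)]
mcp = FastMCP.from_fastapi(app, route_maps=custom_maps)

if __name__ == "__main__":
    mcp.run(transport="sse")
```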

Deployment on AWS

a. Dockerfile

Make sure your Dockerfile is configured correctly to use uv and run app/main.py :
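The repository's actual Dockerfile is not reproduced here; a minimal sketch under the assumptions that uv is installed via pip, the app listens on port 8000, and app/main.py starts the server itself:

```dockerfile
FROM python:3.10-slim

# Assumption: install uv via pip inside the image.
RUN pip install --no-cache-dir uv

WORKDIR /app
COPY requirements.txt .
RUN uv pip install --system -r requirements.txt

COPY . .

# App Runner forwards traffic to this port; adjust if app/main.py differs.
EXPOSE 8000
CMD ["python", "app/main.py"]
```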

b. Build and Upload Image to Amazon ECR (Elastic Container Registry)

  1. Authenticate Docker with ECR:
    aws ecr get-login-password --region your-aws-region | docker login --username AWS --password-stdin your-aws-account-id.dkr.ecr.your-aws-region.amazonaws.com
  2. Create a repository in ECR (if it doesn't exist):
    aws ecr create-repository --repository-name berghain-mcp-api --region your-aws-region
  3. Build your Docker image:
    docker build -t berghain-mcp-api .
  4. Tag your image:
    docker tag berghain-mcp-api:latest your-aws-account-id.dkr.ecr.your-aws-region.amazonaws.com/berghain-mcp-api:latest
  5. Upload the image to ECR:
    docker push your-aws-account-id.dkr.ecr.your-aws-region.amazonaws.com/berghain-mcp-api:latest
    Replace your-aws-region and your-aws-account-id with your own values.

c. Deploy Infrastructure with Terraform

  1. Prepare your Terraform files: Make sure you have your Terraform configuration files (e.g., main.tf , variables.tf , outputs.tf ) in a directory (e.g., terraform/ ). These files should define the necessary AWS resources, such as the AWS App Runner service that will use the ECR image, and the DynamoDB table (if it is also managed by Terraform). Your App Runner configuration in Terraform should reference the uploaded ECR image.
  2. Navigate to the Terraform directory:
    cd terraform
  3. Initialize Terraform:
    terraform init
  4. Apply the Terraform configuration:
    terraform apply
    Review the plan and confirm the application. Terraform will provision the resources.
  5. Get the service URL: Once applied, Terraform should display the defined outputs, including the App Runner service URL. Make a note of this URL (e.g., https://<service-id>.<region>.awsapprunner.com ).
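For step 1, a hedged sketch of what the App Runner portion of main.tf could look like; the resource names, port, IAM role, and variables are assumptions, not the project's actual configuration:

```hcl
# Minimal App Runner service sketch (names and port are assumptions).
resource "aws_apprunner_service" "berghain" {
  service_name = "berghain-mcp-api"

  source_configuration {
    authentication_configuration {
      access_role_arn = aws_iam_role.apprunner_ecr_access.arn
    }
    image_repository {
      image_identifier      = "${var.aws_account_id}.dkr.ecr.${var.aws_region}.amazonaws.com/berghain-mcp-api:latest"
      image_repository_type = "ECR"
      image_configuration {
        port = "8000"
      }
    }
  }
}

output "app_runner_url" {
  value = "https://${aws_apprunner_service.berghain.service_url}"
}
```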

Test the Deployed Solution

a. Adjust Local Test Script ( mcp_local.py )

Place the mcp_local.py file in the root of your project.

Important: Update the mcp_server_url variable in mcp_local.py with the URL you got from the Terraform output.
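The repository's mcp_local.py is not reproduced here; a hedged sketch of such a test script, assuming a recent pydantic-ai release with an SSE MCP client (the client class name, model string, and prompt are placeholders, not the project's actual code):

```python
# Hypothetical mcp_local.py; the exact pydantic-ai MCP client class
# depends on your installed version (MCPServerSSE shown here).
import asyncio

from pydantic_ai import Agent
from pydantic_ai.mcp import MCPServerSSE

# Paste the URL from the Terraform output here.
mcp_server_url = "https://<service-id>.<region>.awsapprunner.com/sse"

server = MCPServerSSE(url=mcp_server_url)
agent = Agent("openai:gpt-4o", mcp_servers=[server])

async def main() -> None:
    async with agent.run_mcp_servers():
        result = await agent.run("What events are on at Berghain this weekend?")
        print(result.output)

asyncio.run(main())
```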

b. Run Test

From the root of your project (or wherever you saved mcp_local.py ):

uv run python mcp_local.py

This runs the PydanticAI agent, which will attempt to connect to your deployed MCP server and perform the query.

License

MIT
