# Kaggle MCP (Model Context Protocol) Server

This repository contains an MCP (Model Context Protocol) server (`server.py`) built with the `fastmcp` library. It interacts with the Kaggle API to provide tools for searching and downloading datasets, and a prompt for generating EDA (exploratory data analysis) notebooks.
## Project Structure

- `server.py`: The FastMCP server application. It defines resources, tools, and prompts for interacting with Kaggle.
- `.env.example`: An example file for environment variables (Kaggle API credentials). Rename to `.env` and fill in your details.
- `requirements.txt`: Lists the necessary Python packages.
- `pyproject.toml` & `uv.lock`: Project metadata and locked dependencies for the `uv` package manager.
- `datasets/`: Default directory where downloaded Kaggle datasets will be stored.
## Setup

- Clone the repository.
- Create a virtual environment (recommended).
- Install dependencies, using either `pip` or `uv`.
- Set up Kaggle API credentials:

  **Method 1 (Recommended): Environment Variables**

  - Create a `.env` file (you can rename the provided `.env.example`).
  - Open the `.env` file and add your Kaggle username and API key.
  - You can obtain your API key from your Kaggle account page (`Account` > `API` > `Create New API Token`). This will download a `kaggle.json` file containing your username and key.

  **Method 2: `kaggle.json` file**

  - Download your `kaggle.json` file from your Kaggle account.
  - Place the `kaggle.json` file in the expected location (usually `~/.kaggle/kaggle.json` on Linux/macOS or `C:\Users\<Your User Name>\.kaggle\kaggle.json` on Windows). The `kaggle` library will automatically detect this file if the environment variables are not set.
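On Linux/macOS, the first three steps above might look like the following; the repository URL is a placeholder, substitute the project's actual location:

```shell
# Clone the repository (placeholder URL)
git clone https://github.com/<your-username>/kaggle-mcp.git
cd kaggle-mcp

# Create and activate a virtual environment
python -m venv .venv
source .venv/bin/activate

# Install dependencies with pip...
pip install -r requirements.txt
# ...or with uv, using the locked dependencies
uv sync
```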
## Running the Server

- Ensure your virtual environment is active.
- Run the MCP server. It will start and register its resources, tools, and prompts; you can then interact with it using an MCP client or compatible tools.
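Assuming `server.py` invokes its FastMCP app when executed directly (a common pattern, but an assumption here), starting it would look like:

```shell
python server.py
```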
## Running the Docker Container

1. **Set up Kaggle API credentials**

   This project requires Kaggle API credentials to access Kaggle datasets.

   - Go to https://www.kaggle.com/settings and click "Create New API Token" to download your `kaggle.json` file.
   - Open the `kaggle.json` file and copy your username and key into a new `.env` file in the project root.
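The resulting `.env` file looks like this (the values are placeholders; `KAGGLE_USERNAME` and `KAGGLE_KEY` are the variable names the `kaggle` library reads):

```shell
# .env (placeholder values; substitute the ones from your kaggle.json)
KAGGLE_USERNAME=your_username
KAGGLE_KEY=your_api_key
```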
2. **Build the Docker image**

3. **Run the Docker container using your `.env` file**

   This will automatically load your Kaggle credentials as environment variables inside the container.
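Steps 2 and 3 might look like this; the image tag `kaggle-mcp` is an arbitrary choice, not mandated by the project:

```shell
# Build the image from the project root
docker build -t kaggle-mcp .

# Run it, loading Kaggle credentials from the .env file
docker run --rm -i --env-file .env kaggle-mcp
```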
## Server Features

The server exposes the following capabilities through the Model Context Protocol:

### Tools

- `search_kaggle_datasets(query: str)`:
  - Searches for datasets on Kaggle matching the provided query string.
  - Returns a JSON list of the top 10 matching datasets with details like reference, title, download count, and last updated date.
- `download_kaggle_dataset(dataset_ref: str, download_path: str | None = None)`:
  - Downloads and unzips files for a specific Kaggle dataset.
  - `dataset_ref`: The dataset identifier in the format `username/dataset-slug` (e.g., `kaggle/titanic`).
  - `download_path` (optional): Specifies where to download the dataset. If omitted, it defaults to `./datasets/<dataset_slug>/` relative to the server script's location.
### Prompts

- `generate_eda_notebook(dataset_ref: str)`:
  - Generates a prompt message suitable for an AI model (like Gemini) to create a basic Exploratory Data Analysis (EDA) notebook for the specified Kaggle dataset reference.
  - The prompt asks for Python code covering data loading, missing value checks, visualizations, and basic statistics.
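A minimal sketch of what such a prompt function might return; the wording is hypothetical, not the server's exact prompt text:

```python
def generate_eda_notebook(dataset_ref: str) -> str:
    # Hypothetical wording covering the four areas listed above
    return (
        f"Write Python code for a basic EDA notebook on the Kaggle dataset "
        f"'{dataset_ref}'. Cover: data loading, missing value checks, "
        "visualizations, and basic statistics."
    )

print(generate_eda_notebook("user/heart-disease-dataset"))
```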
## Connecting to Claude Desktop

Go to Claude > Settings > Developer > Edit Config and edit `claude_desktop_config.json` to add an entry for this server.
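A typical entry might look like this; the server name, command, and path are assumptions, so adjust them to match where you cloned the repository and how you installed its dependencies:

```json
{
  "mcpServers": {
    "kaggle-mcp": {
      "command": "python",
      "args": ["/absolute/path/to/server.py"]
    }
  }
}
```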
## Usage Example

An AI agent or MCP client could interact with this server like this:

- **Agent:** "Search Kaggle for datasets about 'heart disease'"
  - Server executes `search_kaggle_datasets(query='heart disease')`
- **Agent:** "Download the dataset 'user/heart-disease-dataset'"
  - Server executes `download_kaggle_dataset(dataset_ref='user/heart-disease-dataset')`
- **Agent:** "Generate an EDA notebook prompt for 'user/heart-disease-dataset'"
  - Server executes `generate_eda_notebook(dataset_ref='user/heart-disease-dataset')`
  - Server returns a structured prompt message.
- **Agent:** (Sends the prompt to a code-generating model) -> Receives EDA Python code.