# SQLite MCP Server and Client

This project implements an MCP (Model Context Protocol) server and client for interacting with a SQLite database. The server provides tools to query an existing SQLite database, while the client, built with Chainlit and LangGraph, allows users to send natural language prompts to perform database operations via a web-based chat interface.

## Project Overview

- **MCP Server (`mcp_sqlite_server.py`)**: A server using the `FastMCP` library that exposes four tools to interact with a SQLite database (`example.db`):
  - `list_tables`: Lists all tables in the database.
  - `get_table_schema`: Retrieves column names and data types for a specified table.
  - `count_rows`: Counts the number of rows in a specified table.
  - `execute_query`: Executes read-only SQL SELECT queries and returns results.
- **MCP Client (`mcp_client_chainlit.py`)**: A Chainlit-based client using LangGraph's `create_react_agent` and the `langchain-mcp-adapters` library to connect to the server. Users can input natural language prompts (e.g., "List all tables" or "Show me the first 5 rows of the users table"), and the agent dynamically calls the server's tools to generate responses.
- **Transport**: The server uses `streamable-http` transport (HTTP with Server-Sent Events) for JSON-RPC 2.0 communication, running at `http://localhost:8080/mcp`.
- **Security**: The server restricts queries to SELECT statements and validates table names to prevent SQL injection.

## Prerequisites

- **Python**: Version 3.8 or higher.
- **SQLite Database**: An existing `example.db` file in the project directory with data (e.g., tables like `users` or `orders`).
- **API Key**: An Anthropic API key for the client (or another LLM provider like OpenAI if preferred).

## Dependencies

The project requires the following Python packages, listed in `requirements.txt`:

```
mcp==0.1.0
loguru==0.7.2
chainlit==1.1.0
langgraph==0.2.14
langchain==0.2.16
langchain-mcp-adapters==0.1.0
langchain-anthropic==0.1.23
```

## Setup Instructions

1. **Clone the Repository** (if applicable):
   ```bash
   git clone <repository-url>
   cd <repository-directory>
   ```
2. **Create and Activate a Virtual Environment**:
   ```bash
   python -m venv venv
   source venv/bin/activate  # On Windows: venv\Scripts\activate
   ```
3. **Install Dependencies**:
   ```bash
   pip install -r requirements.txt
   ```
4. **Set Environment Variables**:
   - Configure the Anthropic API key for the client:
     ```bash
     export ANTHROPIC_API_KEY=your_anthropic_api_key
     ```
   - If using another LLM (e.g., OpenAI), set the appropriate API key:
     ```bash
     export OPENAI_API_KEY=your_openai_api_key
     ```
     and update `mcp_client_chainlit.py` to use `langchain-openai` instead of `langchain-anthropic`.
5. **Prepare the SQLite Database**:
   - Ensure an `example.db` file exists in the project directory with the desired tables and data (a minimal sketch for creating one follows this list).
   - The server does not initialize or modify the database; it assumes the database is pre-populated.
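If you do not already have a populated `example.db`, a small script like the one below can create one with Python's built-in `sqlite3` module. This is only an illustrative sketch: the file name `create_example_db.py` and the exact `users`/`orders` columns are assumptions, not part of the project; adjust them to whatever data you want to query.

```python
# create_example_db.py -- hypothetical helper, not part of the repository.
# Creates a small example.db so the MCP server has something to query.
import sqlite3

conn = sqlite3.connect("example.db")
cur = conn.cursor()

# Two illustrative tables matching the names used in the examples below.
cur.execute("CREATE TABLE IF NOT EXISTS users (id INTEGER PRIMARY KEY, name TEXT)")
cur.execute(
    "CREATE TABLE IF NOT EXISTS orders ("
    "id INTEGER PRIMARY KEY, user_id INTEGER REFERENCES users(id), amount REAL)"
)

# Seed a few rows; ids are assigned automatically by INTEGER PRIMARY KEY.
cur.executemany("INSERT INTO users (name) VALUES (?)", [("Alice",), ("Bob",)])
cur.executemany(
    "INSERT INTO orders (user_id, amount) VALUES (?, ?)",
    [(1, 19.99), (2, 5.00)],
)

conn.commit()
conn.close()
```

Run it once with `python create_example_db.py` before starting the server.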
## Running the Project

1. **Start the MCP Server**:
   - Run the server script to expose the SQLite database tools:
     ```bash
     python mcp_sqlite_server.py
     ```
   - The server will start at `http://localhost:8080/mcp` using `streamable-http` transport.
   - Verify it’s running by checking the terminal for log messages (e.g., "Starting MCP server with streamable-http transport...").
2. **Start the Chainlit Client**:
   - Run the client script to launch the web-based chat interface:
     ```bash
     chainlit run mcp_client_chainlit.py
     ```
   - Open a browser and navigate to `http://localhost:8001` (or the port shown in the terminal).
   - The Chainlit UI will display a welcome message: "Welcome to the SQLite MCP Client! Ask about the database..."

## Usage

1. **Interact via Chainlit UI**:
   - In the Chainlit interface (`http://localhost:8001`), enter natural language prompts to query the database.
   - Example prompts:
     - "List all tables in the database" → `["users", "orders"]`
     - "What is the schema of the users table?" → `[{"column_name": "id", "data_type": "INTEGER"}, {"column_name": "name", "data_type": "TEXT"}]`
     - "How many rows are in the orders table?" → `42` (example)
     - "Run SELECT name FROM users WHERE id = 1" → `Results: name = Alice`
   - The LangGraph agent will use the MCP server's tools to process the prompt and return a formatted response.
2. **Expected Behavior**:
   - The client sends prompts to the LangGraph agent, which decides which tools to call (e.g., `list_tables`, `execute_query`).
   - The `MultiServerMCPClient` communicates with the server at `http://localhost:8080/mcp` to execute tool requests.
   - Responses are displayed in the Chainlit chat interface.

## Example Interactions

- **Prompt**: "Show the schema and row count for the users table."
  - **Agent Process**:
    1. Calls `get_table_schema("users")` → `[{"column_name": "id", "data_type": "INTEGER"}, {"column_name": "name", "data_type": "TEXT"}]`
    2. Calls `count_rows("users")` → `10` (example)
    3. Response: "The users table has columns: id (INTEGER), name (TEXT). It contains 10 rows."
- **Prompt**: "Get all names from the users table."
  - **Agent Process**:
    1. Calls `execute_query("SELECT name FROM users")` → `{"columns": ["name"], "rows": [["Alice"], ["Bob"]]}` (example)
    2. Response: "Names in the users table: Alice, Bob"

## Troubleshooting

- **Server Not Running**: Ensure `mcp_sqlite_server.py` is running and accessible at `http://localhost:8080/mcp`. Check terminal logs for errors.
- **Client Errors**:
  - If the Chainlit UI fails to load, verify the port (`8001`) is not in use.
  - If the agent fails to connect to the server, check the `MCP_SERVER_CONFIG` URL in `mcp_client_chainlit.py`.
- **LLM Issues**: Ensure the Anthropic API key is valid. For alternative LLMs, install the appropriate package (e.g., `langchain-openai`) and update the client script.
- **Database Errors**: Confirm `example.db` exists and is readable. Check server logs (`loguru`) for SQL errors.

## Notes

- **Security**: The server restricts queries to SELECT statements and validates table names to prevent SQL injection. For production, consider adding OAuth 2.1 authentication.
- **Extensibility**: Add more tools to the server by defining new `@mcp.tool()` functions in `mcp_sqlite_server.py`. The client will automatically discover them via `MultiServerMCPClient` (a hypothetical example is sketched at the end of this README).
- **LLM Flexibility**: The client uses Anthropic's Claude by default. To use another LLM (e.g., OpenAI's GPT-4o), replace `langchain-anthropic` with `langchain-openai` in `requirements.txt` and update the LLM import in `mcp_client_chainlit.py`.
- **Dependencies**: Check for version compatibility in `requirements.txt`. Update versions if newer releases are available.

## Contributing

Contributions are welcome! To add features or fix bugs:

1. Fork the repository.
2. Create a branch (`git checkout -b feature/your-feature`).
3. Commit changes (`git commit -m "Add your feature"`).
4. Push to the branch (`git push origin feature/your-feature`).
5. Open a pull request.
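## Example: Adding a New Tool (Sketch)

As a sketch of the extensibility note above, a new capability can be exposed by adding another `@mcp.tool()` function to `mcp_sqlite_server.py`; the client discovers it automatically on the next run. The `list_indexes` tool below is hypothetical and assumes the server's existing `mcp` FastMCP instance and the `example.db` path; it reuses the same idea the README describes of validating table names rather than interpolating untrusted input into SQL.

```python
# Hypothetical addition to mcp_sqlite_server.py (assumes the module's
# existing `mcp = FastMCP(...)` instance is in scope).
import sqlite3
from typing import List

@mcp.tool()
def list_indexes(table_name: str) -> List[str]:
    """Return the names of all indexes defined on the given table."""
    conn = sqlite3.connect("example.db")
    try:
        # Validate the table name against sqlite_master before using it,
        # mirroring the server's SQL-injection safeguards.
        tables = {
            row[0]
            for row in conn.execute(
                "SELECT name FROM sqlite_master WHERE type='table'"
            )
        }
        if table_name not in tables:
            raise ValueError(f"Unknown table: {table_name}")
        rows = conn.execute(
            "SELECT name FROM sqlite_master WHERE type='index' AND tbl_name = ?",
            (table_name,),
        ).fetchall()
        return [r[0] for r in rows]
    finally:
        conn.close()
```

After restarting the server, a prompt such as "What indexes exist on the users table?" would let the agent call this tool.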
