💰 Expense Tracker using MCP (FastMCP + LangChain + Ollama) - Sample Project for Understanding MCP

This project demonstrates a simple end-to-end MCP (Model Context Protocol) example where:

  • A FastMCP server exposes tools to manage expenses stored in SQLite

  • A LangChain client connects to the MCP server

  • An LLM (Llama 3.2 via Ollama) decides when to call tools

  • Natural language queries like "Add my expense 500 to groceries" automatically trigger backend database operations

📌 Architecture Overview

User (CLI)
    │
    ▼
LangChain Client (client.py)
    │
    │  MCP (stdio)
    ▼
FastMCP Server (main.py)
    │
    ▼
SQLite Database (expenses.db)

Key Components

Component              Description
---------              -----------
FastMCP                Exposes database operations as tools
LangChain MCP Adapter  Connects LLM to MCP tools
Ollama (Llama 3.2:3b)  Interprets user intent and calls tools
SQLite                 Persistent expense storage


📂 Project Structure

.
├── main.py       # FastMCP expense database server
├── client.py     # LangChain MCP client with LLM
├── expenses.db   # SQLite database (auto-created)
└── README.md

🚀 Features

  • ✅ Add expenses using natural language

  • ✅ View total expenses

  • ✅ List all expenses

  • ✅ Automatic tool selection by LLM

  • ✅ Persistent storage using SQLite

  • ✅ MCP-compliant architecture

🛠️ Tools Exposed by MCP Server

The FastMCP server exposes the following tools:

add_expense

Adds a new expense entry.

{ "amount": 500, "category": "groceries", "description": "weekly shopping" }

get_total

Returns the total sum of all expenses.

get_all_expenses

Returns a list of all recorded expenses.
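For reference, here is a minimal sketch of how main.py could implement these three tools with FastMCP and Python's built-in sqlite3 module. The server name, table schema (an expenses table with amount, category, and description columns), and return formats are assumptions for illustration, not the project's confirmed implementation:

import sqlite3

from fastmcp import FastMCP

DB_PATH = "expenses.db"
mcp = FastMCP("expense-tracker")  # server name is an assumption

def _connect() -> sqlite3.Connection:
    # Open the database and create the (assumed) schema on first use
    conn = sqlite3.connect(DB_PATH)
    conn.execute(
        "CREATE TABLE IF NOT EXISTS expenses ("
        "id INTEGER PRIMARY KEY AUTOINCREMENT, "
        "amount REAL NOT NULL, "
        "category TEXT NOT NULL, "
        "description TEXT)"
    )
    return conn

@mcp.tool()
def add_expense(amount: float, category: str, description: str = "") -> str:
    """Adds a new expense entry."""
    with _connect() as conn:  # the with-block commits on success
        conn.execute(
            "INSERT INTO expenses (amount, category, description) VALUES (?, ?, ?)",
            (amount, category, description),
        )
    return f"Added expense of {amount} to '{category}'."

@mcp.tool()
def get_total() -> float:
    """Returns the total sum of all expenses."""
    with _connect() as conn:
        (total,) = conn.execute(
            "SELECT COALESCE(SUM(amount), 0) FROM expenses"
        ).fetchone()
    return total

@mcp.tool()
def get_all_expenses() -> list[dict]:
    """Returns a list of all recorded expenses."""
    with _connect() as conn:
        rows = conn.execute(
            "SELECT id, amount, category, description FROM expenses"
        ).fetchall()
    return [
        {"id": r[0], "amount": r[1], "category": r[2], "description": r[3]}
        for r in rows
    ]

if __name__ == "__main__":
    mcp.run()  # serves over stdio by default

FastMCP derives each tool's MCP schema from the function's type hints and docstring, so registering plain Python functions is all that's needed.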

βš™οΈ Prerequisites

Make sure you have the following installed:

  • Python 3.10+

  • Ollama

  • Llama 3.2 model

  • uv (Python package runner)

Pull the model:

ollama pull llama3.2:3b

📦 Install Dependencies

uv add fastmcp langchain langchain-mcp-adapters langchain-ollama

▶️ Running the Client

Update paths inside client.py:

"command": "/home/omkar/.local/bin/uv", "args": [ "run", "fastmcp", "run", "/full/path/to/main.py" ]

Then run:

uv run client.py
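For orientation, a minimal client.py along these lines would connect to the server and hand its tools to the model. This sketch assumes MultiServerMCPClient from langchain-mcp-adapters (whose API differs across versions) plus LangGraph's prebuilt ReAct agent, an extra dependency not in the install command above; treat it as illustrative rather than the project's exact code:

import asyncio

from langchain_mcp_adapters.client import MultiServerMCPClient
from langchain_ollama import ChatOllama
from langgraph.prebuilt import create_react_agent

async def main() -> None:
    # Spawn the FastMCP server as a subprocess and talk to it over stdio
    client = MultiServerMCPClient(
        {
            "expenses": {
                "command": "/home/omkar/.local/bin/uv",
                "args": ["run", "fastmcp", "run", "/full/path/to/main.py"],
                "transport": "stdio",
            }
        }
    )
    tools = await client.get_tools()  # MCP tools wrapped as LangChain tools

    # Llama 3.2 (via Ollama) decides when to call the tools
    agent = create_react_agent(ChatOllama(model="llama3.2:3b"), tools)
    result = await agent.ainvoke(
        {"messages": [("user", "Add my expense 500 to groceries")]}
    )
    print(result["messages"][-1].content)

if __name__ == "__main__":
    asyncio.run(main())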

🧠 How It Works (Step-by-Step)

  1. User enters a natural language query

  2. The LLM decides whether a tool is needed

  3. If a tool is required, the LLM generates the tool name and arguments

  4. LangChain invokes the MCP tool

  5. The result is returned to the LLM

  6. The LLM generates the final, user-friendly response
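This flow can also be written out by hand, which makes steps 2 through 6 concrete. A rough sketch, assuming a tools list obtained from the MCP client as above; it handles a single round of tool calls, whereas a full agent loop would repeat until the model stops requesting tools:

from langchain_core.messages import HumanMessage, ToolMessage
from langchain_ollama import ChatOllama

async def ask(tools, query: str) -> str:
    # Step 1: the user's natural language query
    messages = [HumanMessage(query)]
    llm = ChatOllama(model="llama3.2:3b").bind_tools(tools)
    tools_by_name = {t.name: t for t in tools}

    # Steps 2-3: the model decides whether a tool is needed
    # and, if so, emits the tool name plus arguments
    ai_msg = await llm.ainvoke(messages)
    messages.append(ai_msg)

    # Steps 4-5: invoke each requested MCP tool and feed the result back
    for call in ai_msg.tool_calls:
        result = await tools_by_name[call["name"]].ainvoke(call["args"])
        messages.append(ToolMessage(content=str(result), tool_call_id=call["id"]))

    # Step 6: the model turns the tool output into a user-friendly answer
    final = await llm.ainvoke(messages)
    return final.content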

