
akahu-mcp

by NZKea

list_transactions

List transactions for a bank account using cached data from the last 90 days, refreshed daily. Filter by date range, limit results, and optionally force a cache refresh.

Instructions

List transactions for a bank account, served from a local cache that keeps the last ~90 days. The cache is refreshed at most once per 24h (Akahu Personal only refreshes upstream daily); pass force=True to bypass the TTL.

Args:
  • account: account id or fuzzy substring match against account name
  • start: ISO date (YYYY-MM-DD), inclusive lower bound on transaction date
  • end: ISO date (YYYY-MM-DD), inclusive upper bound
  • limit: max rows to return (default 100, newest first)
  • force: bypass the 24h cache TTL
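As a concrete illustration, an agent might pass arguments like the following (the account name and dates here are hypothetical, not from the server):

```python
from datetime import date

# Hypothetical example arguments for a list_transactions call
args = {
    "account": "Everyday Checking",  # fuzzy substring match against account name
    "start": "2024-01-01",           # inclusive ISO lower bound
    "end": "2024-03-31",             # inclusive ISO upper bound
    "limit": 50,
    "force": False,
}

# The tool expects YYYY-MM-DD strings; dates can be validated client-side
for key in ("start", "end"):
    date.fromisoformat(args[key])  # raises ValueError on a malformed date
```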

Input Schema

Name     Required
account  Yes
start    No
end      No
limit    No
force    No

(The schema provides no descriptions or defaults.)
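FastMCP derives the JSON input schema from the function's type hints and defaults. A rough sketch of what that derived schema looks like for this signature (the exact output depends on the FastMCP and pydantic versions in use):

```python
# Illustrative sketch of the derived JSON schema; not copied from the server.
input_schema = {
    "type": "object",
    "properties": {
        "account": {"type": "string"},
        "start": {"anyOf": [{"type": "string"}, {"type": "null"}], "default": None},
        "end": {"anyOf": [{"type": "string"}, {"type": "null"}], "default": None},
        "limit": {"type": "integer", "default": 100},
        "force": {"type": "boolean", "default": False},
    },
    "required": ["account"],  # only account lacks a default
}
print(input_schema["required"])  # ['account']
```

Note that only `account` is required; every other parameter has a default, which matches the table above.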

Output Schema

No output schema documented.

Implementation Reference

  • The 'list_transactions' MCP tool handler function. It takes an account (id or name), optional start/end dates, a limit, and a force flag. It resolves the account, ensures transactions are fresh in the cache, queries the cached transactions, and returns the results.
    @mcp.tool()
    async def list_transactions(
        account: str,
        start: str | None = None,
        end: str | None = None,
        limit: int = 100,
        force: bool = False,
    ) -> dict[str, Any]:
        """List transactions for a bank account, served from a local cache that
        keeps the last ~90 days. The cache is refreshed at most once per 24h
        (Akahu Personal only refreshes upstream daily); pass force=True to bypass
        the TTL.
    
        Args:
            account: account id or fuzzy substring match against account name
            start: ISO date (YYYY-MM-DD), inclusive lower bound on transaction date
            end: ISO date (YYYY-MM-DD), inclusive upper bound
            limit: max rows to return (default 100, newest first)
            force: bypass the 24h cache TTL
        """
        accounts = await sync.ensure_accounts_fresh(force=force)
        acc = _find_account(accounts, account)
        if acc is None:
            return {"error": f"No account matching {account!r}"}
    
        fetched = await sync.ensure_transactions_fresh(acc["_id"], force=force)
        rows = cache.get_transactions_cached(acc["_id"], start=start, end=end, limit=limit)
        return {
            "account_id": acc["_id"],
            "account_name": acc.get("name"),
            "fetched_from_akahu": fetched,
            "returned": len(rows),
            "transactions": rows,
        }
  • The tool is registered via the @mcp.tool() decorator on line 91, which registers 'list_transactions' as an MCP tool with FastMCP.
    @mcp.tool()
  • Function signature and docstring define the schema: account (str, required), start/end (optional ISO date strings), limit (int, default 100), force (bool, default False).
    async def list_transactions(
        account: str,
        start: str | None = None,
        end: str | None = None,
        limit: int = 100,
        force: bool = False,
    ) -> dict[str, Any]:
        """List transactions for a bank account, served from a local cache that
        keeps the last ~90 days. The cache is refreshed at most once per 24h
        (Akahu Personal only refreshes upstream daily); pass force=True to bypass
        the TTL.
    
        Args:
            account: account id or fuzzy substring match against account name
            start: ISO date (YYYY-MM-DD), inclusive lower bound on transaction date
            end: ISO date (YYYY-MM-DD), inclusive upper bound
            limit: max rows to return (default 100, newest first)
            force: bypass the 24h cache TTL
        """
  • Helper function 'ensure_transactions_fresh' that handles cache TTL logic and calls the Akahu API to fetch transactions, then stores them in the local SQLite cache.
    async def ensure_transactions_fresh(
        account_id: str,
        lookback_days: int = DEFAULT_LOOKBACK_DAYS,
        force: bool = False,
    ) -> int:
        """Refresh transactions for one account. Returns the number of rows fetched
        from Akahu (which is upserted into the cache; many may already exist)."""
        cache.init_db()
        state = cache.get_sync_state(account_id)
        now = _now()
    
        if not force and state and (now - state["last_synced_at"]) < DEFAULT_TTL_SECONDS:
            logger.info(
                "Skipping refresh for %s — last synced %ds ago",
                account_id,
                now - state["last_synced_at"],
            )
            return 0
    
        end_dt = datetime.now(timezone.utc)
        if state is None:
            start_dt = end_dt - timedelta(days=lookback_days)
        else:
            # Resync from `last_synced_at` minus a small overlap so late-settling
            # transactions are picked up; INSERT OR REPLACE deduplicates.
            start_dt = datetime.fromtimestamp(
                state["last_synced_at"], tz=timezone.utc
            ) - timedelta(days=OVERLAP_DAYS)
    
        start_iso = _iso_z(start_dt)
        end_iso = _iso_z(end_dt)
        logger.info("Fetching transactions for %s from %s to %s", account_id, start_iso, end_iso)
    
        client = AkahuClient()
        txns = await client.get_transactions(account_id, start=start_iso, end=end_iso)
        cache.put_transactions(account_id, txns)
    
        oldest_iso = start_dt.date().isoformat()
        cache.set_sync_state(account_id, now, oldest_iso)
        return len(txns)
  • Helper function 'get_transactions_cached' that queries the local SQLite cache for transactions by account_id, with optional start/end date filtering and limit.
    def get_transactions_cached(
        account_id: str,
        start: str | None = None,
        end: str | None = None,
        limit: int | None = None,
    ) -> list[dict[str, Any]]:
        sql = (
            "SELECT id, account_id, date, amount, description, merchant_name "
            "FROM transactions WHERE account_id = ?"
        )
        params: list[Any] = [account_id]
        if start:
            sql += " AND date >= ?"
            params.append(start)
        if end:
            sql += " AND date <= ?"
            params.append(end)
        sql += " ORDER BY date DESC"
        if limit:
            sql += f" LIMIT {int(limit)}"
        with _connect() as conn:
            rows = conn.execute(sql, params).fetchall()
        return [dict(r) for r in rows]
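The filtering and ordering behaviour of that query can be demonstrated end to end with an in-memory SQLite database (table columns inferred from the SELECT above; the rows are made up):

```python
import sqlite3

# Minimal in-memory demo of the cache query shape above
conn = sqlite3.connect(":memory:")
conn.row_factory = sqlite3.Row
conn.execute(
    "CREATE TABLE transactions ("
    "id TEXT PRIMARY KEY, account_id TEXT, date TEXT, "
    "amount REAL, description TEXT, merchant_name TEXT)"
)
conn.executemany(
    "INSERT INTO transactions VALUES (?, ?, ?, ?, ?, ?)",
    [
        ("t1", "acc_1", "2024-01-05", -12.50, "Coffee", "Cafe"),
        ("t2", "acc_1", "2024-01-10", -40.00, "Groceries", "Market"),
        ("t3", "acc_1", "2024-01-20", 250.00, "Refund", None),
    ],
)

# Same shape of query: date filter, newest-first ordering, LIMIT
result = conn.execute(
    "SELECT id, date FROM transactions "
    "WHERE account_id = ? AND date >= ? ORDER BY date DESC LIMIT 2",
    ("acc_1", "2024-01-06"),
).fetchall()
print([r["id"] for r in result])  # ['t3', 't2']
```

Because ISO `YYYY-MM-DD` strings sort lexicographically in date order, plain string comparison in the `WHERE` and `ORDER BY` clauses is sufficient.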
Behavior: 4/5

Does the description disclose side effects, auth requirements, rate limits, or destructive behavior?

With no annotations, the description fully carries the burden. It discloses caching (last ~90 days, 24h refresh), force parameter effect, and default limit order. It does not cover error handling or edge cases, but the output schema exists for return format.

Agents need to know what a tool does to the world before calling it. Descriptions should go beyond structured annotations to explain consequences.

Conciseness: 4/5

Is the description appropriately sized, front-loaded, and free of redundancy?

The description is structured with the main sentence followed by bullet-like Args. It is slightly verbose but each sentence contributes essential information. The purpose is front-loaded.

Shorter descriptions cost fewer tokens and are easier for agents to parse. Every sentence should earn its place.

Completeness: 4/5

Given the tool's complexity, does the description cover enough for an agent to succeed on first attempt?

Given the 5 parameters and output schema, the description covers caching, date range, limit, and force flag. It lacks mention of error handling or account not found, but is largely complete for a list tool.

Complex tools with many parameters or behaviors need more documentation. Simple tools need less. This dimension scales expectations accordingly.

Parameters: 5/5

Does the description clarify parameter syntax, constraints, interactions, or defaults beyond what the schema provides?

Schema coverage is 0%, but the description thoroughly explains all five parameters: account (fuzzy match), start/end (ISO dates), limit (max rows, default 100), force (bypass cache). This adds substantial value beyond the bare schema.

Input schemas describe structure but not intent. Descriptions should explain non-obvious parameter relationships and valid value ranges.

Purpose: 5/5

Does the description clearly state what the tool does and how it differs from similar tools?

The description clearly states the action and resource: 'List transactions for a bank account'. The sibling tools (get_share_holdings, list_accounts) deal with distinct resources, eliminating confusion.

Agents choose between tools based on descriptions. A clear purpose with a specific verb and resource helps agents select the right tool.

Usage Guidelines: 4/5

Does the description explain when to use this tool, when not to, or what alternatives exist?

The description explains caching behavior and the force parameter to bypass TTL, giving context on when to use this tool. It does not explicitly exclude alternative tools, but the resource difference makes it clear.

Agents often have multiple tools that could apply. Explicit usage guidance like "use X instead of Y when Z" prevents misuse.
