Integrations
- FastAPI – Powers the web server that connects to the MCP server and handles client requests from the htmx UI.
- htmx – Provides a lightweight frontend interface for the chatbot, allowing users to interact with the real estate data analysis tools.
- Jinja – Templates the webpages returned to the htmx frontend, enabling dynamic content generation for the chat interface.
The Motivation
How does an LLM interact with its environment in your systems? It doesn't. It's a call-and-response API. I keep hearing Agents this, Agents that, but it seems to me that without access to tools that let them observe or manipulate their environment, Agents are fundamentally limited in performing the kinds of meaningful tasks we care about.
The first ten times I see a new buzzword I try my best to ignore the noise, but after Model Context Protocol (MCP) kept popping up, I read into it. By presenting a consistent format for advertising which tools and resources exist and how to use them, it neatly solves that problem. When you need a new connector from an LLM to a tool or database, you can find a prebuilt open source MCP server and drop it in. MCP also has a corporate sponsor that maintains it - Anthropic, one of the key industry players - so it was dogfooded (dogfed?) since day one and built to last.
The Plan
I like to teach myself new things by solving real problems - I find the results stick in my head better that way. I've done a ton of Retrieval Augmented Generation (RAG) work and I'm prepping a lot of it for open sourcing, and I see MCP as the next evolution in RAG architectures. I think in the near future, if you're doing RAG well, you're doing it on top of MCP.
I've been learning about real estate investment lately, in anticipation of a market correction sometime in the next year or two. I'm going to load some free Zillow data into a database, and then I'm going to run an MCP server on top of it. Then, I'm going to run a FastAPI server that connects to the MCP server and serves requests from an htmx UI. The end goal is a chatbot that can help me explore this real estate data and make some decisions about which markets to look for deals in.
The data I'm going to be loading includes:
- Zillow Home Value Index (ZHVI) – A measure of the typical home value for a given region and housing type within the 65th to 95th percentile.
- Zillow Home Value Forecast (ZHVF) – A forward-looking estimate of home value trends, based on current and historical market data.
- Zillow Observed Rent Index (ZORI) – A smoothed measure of typical market rents observed across current and off market rental listings.
- Zillow Observed Rent Forecast (ZORF) – Projected changes in rent prices based on trends in the ZORI and market indicators.
You may notice that I'm excluding the ZHVI data and that's due to lack of data - they only recently started to track this.
This is very domain-specific data - you wouldn't want to make guesses about it without reading the definitions - so I think it'll make for a great example, since the LLM won't be able to intuit its way through a trivial set of problems.
I want to ask the LLM to help me find a market that balances affordable houses, rents that cash flow, and strong projected future growth rates. I don't want to take those projected growth rates for granted either - let's do a little programming to see how often, and in which areas, they tend to be correct. I expect this to take a lot of prompt engineering to craft different queries. Then, I'll be able to store these prompts as MCP prompts to suggest to the chatbot and execute against the database.
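To make that concrete, here's a rough sketch of what one of those stored prompts could look like, jumping ahead to the FastMCP decorator API the server uses later in this post. The function name, threshold, and wording are placeholders, not the prompts I actually ended up with.

```python
# prompts.py (sketch) - a stored MCP prompt the chatbot could suggest.
# The server name, function name, threshold, and wording are all placeholders.
from mcp.server.fastmcp import FastMCP

mcp = FastMCP("zillow")

@mcp.prompt()
def cash_flowing_markets(max_home_value: int = 350000) -> str:
    """Ask the agent to hunt for affordable markets where rents cash flow."""
    return (
        f"Using the ZHVI, ZORI, and forecast tables, find metro areas where the "
        f"typical home value is under ${max_home_value:,}, the typical rent would "
        f"cover an estimated monthly mortgage payment at current rates, and the "
        f"forecasted growth is strong. Show the SQL you ran and explain how you "
        f"ranked the markets."
    )
```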
I'm choosing SQLite as my database, since it stores everything in a local file, so there's no separate database process to bring up or down. I'm going to be running the database in a container for portability, and preserving the database state in a file means that I can treat it as a "cached" data load if I want to. This way, if I mount a local volume, I don't need to reload the Zillow data into the database every time I restart the container.
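Here's a minimal sketch of that cached load, assuming pandas is available, the Zillow CSV exports sit next to the script, and the container mounts a volume at /data; the file names and table names are placeholders.

```python
# load_data.py (sketch) - the "cached" load. Assumes pandas, Zillow CSV exports
# alongside the script, and a volume mounted at /data; names are placeholders.
import sqlite3
from pathlib import Path

import pandas as pd

DB_PATH = Path("/data/zillow.db")
CSVS = {  # table name -> CSV export from Zillow's research data page
    "zhvi": "zhvi.csv",
    "zhvf": "zhvf.csv",
    "zori": "zori.csv",
    "zorf": "zorf.csv",
}

def load_zillow_data() -> None:
    """Load the CSVs into SQLite unless the database file already exists,
    in which case the mounted file acts as a cached data load."""
    if DB_PATH.exists():
        return
    with sqlite3.connect(DB_PATH) as conn:
        for table, csv in CSVS.items():
            pd.read_csv(csv).to_sql(table, conn, if_exists="replace", index=False)

if __name__ == "__main__":
    load_zillow_data()
```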
In order to be able to talk with this data, I know I'll need to write some kind of MCP server that can run queries against the tables we just wrote. From what I've read so far, I think I'll want to encode useful queries as either MCP prompts or tools, which are then presented to the LLM behind the MCP client, but I'm really not sure what that looks like in practice yet. I'm starting with this documentation from langchain, blended with a few other things I've been reading, as a starter server I can build from. Then I mostly followed the Anthropic documentation for writing the start of my MCP client, although I swapped much of their code for a langgraph ReAct agent.

I wasn't sure how to get the client and server to talk to each other, so I found this really great MCP documentation which covers the different transport methods built into MCP. I'm going with the default, stdio, where the client launches the server as a subprocess and passes messages over stdin/stdout. This will be a single container with a local database and the backend of an app inside, so that makes sense here, but for anything production grade I imagine the MCP server would be decoupled from the backend, which would require the more complex HTTP streaming configuration. I'll leave that for when I have to build something like this for work.
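Putting those pieces together, here's roughly where I landed; treat it as a sketch rather than the exact code. The database path, tool name, and model choice are assumptions, and the read-only query tool is about the simplest thing that could work.

```python
# server.py (sketch) - a minimal MCP server exposing one read-only query tool.
import sqlite3

from mcp.server.fastmcp import FastMCP

mcp = FastMCP("zillow")

@mcp.tool()
def run_query(sql: str) -> str:
    """Run a read-only SQL query against the Zillow SQLite database and
    return the rows as plain text for the model to read."""
    conn = sqlite3.connect("file:/data/zillow.db?mode=ro", uri=True)
    try:
        rows = conn.execute(sql).fetchall()
        return "\n".join(str(row) for row in rows)
    finally:
        conn.close()

if __name__ == "__main__":
    # stdio is the default transport: the client launches this script as a
    # subprocess and exchanges messages over stdin/stdout.
    mcp.run(transport="stdio")
```

On the client side, the langchain MCP adapters turn the session's tools into LangChain tools, which slot straight into a langgraph ReAct agent:

```python
# client.py (sketch) - spawn the server over stdio and wrap its tools in a
# langgraph ReAct agent. The model choice is an assumption.
from contextlib import AsyncExitStack

from langchain_anthropic import ChatAnthropic
from langchain_mcp_adapters.tools import load_mcp_tools
from langgraph.prebuilt import create_react_agent
from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client

async def build_agent(stack: AsyncExitStack):
    """Open an MCP session to server.py over stdio and return a ReAct agent
    that can call its tools."""
    params = StdioServerParameters(command="python", args=["server.py"])
    read, write = await stack.enter_async_context(stdio_client(params))
    session = await stack.enter_async_context(ClientSession(read, write))
    await session.initialize()
    tools = await load_mcp_tools(session)  # MCP tools -> LangChain tools
    model = ChatAnthropic(model="claude-3-5-sonnet-latest")
    return create_react_agent(model, tools)
```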
Right now this is a single script that I'm executing for a single message generation. To make this useful, I need to attach a web server that takes in user queries and runs them through these functions. To make the web server useful, I'm going to need a frontend that looks like a chat interface to make requests against the server. Streamlit has been all the rage lately for AI projects like this, and I've tried it out in previous projects as well as internal tooling for work, but I've concluded that running slow React code from slow Python code, with an entire additional chunk of server code you need to deploy for the Python, is not the best way to build lightweight frontends. No one wants to rewrite a proof of concept into something properly production-grade after it gets approved, so oftentimes that just doesn't happen. I've written more than my fair share of React, which would remove the Python middleman here, but it really feels like overkill - every time I build something fullstack I end up doing more frontend development than whatever I was actually trying to learn.

htmx has been on my radar for a while, and I'm going to try it out for this project - it's ridiculously lightweight and provides the exact minimalist functionality that a low-logic frontend like a chat app requires. Also, the memes are elite. The only impact this has on my server choice is that I need to return templated HTML, so I'm going to use Jinja to template my webpages and send them with an htmx script tag. That's seriously all you need to run it, it's amazing.
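Here's a minimal sketch of that echo version, assuming a templates/index.html that holds the chat form and the htmx script tag; the route names and fragment markup are placeholders.

```python
# app.py (sketch) - the echo version. Assumes a templates/index.html that
# contains the chat form and the htmx <script> tag; routes are placeholders.
from fastapi import FastAPI, Form, Request
from fastapi.responses import HTMLResponse
from fastapi.templating import Jinja2Templates

app = FastAPI()
templates = Jinja2Templates(directory="templates")

@app.get("/", response_class=HTMLResponse)
async def index(request: Request):
    # Render the chat page; the template pulls in htmx with a single script tag.
    return templates.TemplateResponse(request, "index.html")

@app.post("/chat", response_class=HTMLResponse)
async def chat(message: str = Form(...)):
    # htmx posts the form field and swaps the returned fragment into the chat log.
    return HTMLResponse(f"<div class='message'>{message}</div>")
```

The form in index.html just needs an hx-post pointing at /chat and an hx-target pointing at the chat log for the swap to work.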
Echo servers aren't terribly interesting though, so let's plug in our LLM agent. I added the MCP client into the FastAPI server's lifespan, binding it to the app state so it can be shared across request handlers in a thread-safe way (not that the client or server are built for pooling). We just pass the message to the agent, format the output in a little HTML, and pass that back to the UI.
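Roughly, the wiring looks like this; it reuses the build_agent sketch from above and replaces the echo /chat endpoint, with the fragment markup again a placeholder.

```python
# app.py (sketch) - plugging in the agent. Reuses the build_agent sketch above
# and replaces the echo /chat endpoint; the fragment markup is a placeholder.
import html
from contextlib import AsyncExitStack, asynccontextmanager

from fastapi import FastAPI, Form, Request
from fastapi.responses import HTMLResponse

from client import build_agent

@asynccontextmanager
async def lifespan(app: FastAPI):
    # Keep the stdio session to the MCP server open for the life of the app
    # and stash the agent on app.state so request handlers can reach it.
    async with AsyncExitStack() as stack:
        app.state.agent = await build_agent(stack)
        yield

app = FastAPI(lifespan=lifespan)

@app.post("/chat", response_class=HTMLResponse)
async def chat(request: Request, message: str = Form(...)):
    result = await request.app.state.agent.ainvoke({"messages": [("user", message)]})
    reply = str(result["messages"][-1].content)
    # Wrap the agent's reply in a fragment for htmx to swap into the page.
    return HTMLResponse(f"<div class='message'>{html.escape(reply)}</div>")
```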