Skip to main content
Glama

MCP Toolbox for Databases

by googleapis
Apache 2.0
11,043
  • Linux
alloydb_ai_nl.ipynb39 kB
{ "cells": [ { "cell_type": "markdown", "metadata": { "id": "fWKN6xBYgJRa" }, "source": [ "[![Open In\n", "Colab](https://colab.research.google.com/assets/colab-badge.svg)](https://colab.research.google.com/github/googleapis/genai-toolbox/blob/main/docs/en/samples/alloydb/ai-nl/alloydb_ai_nl.ipynb)\n", "\n", "# Objective\n", "\n", "In this notebook, we will be using the [AlloyDB AI NL\n", "tool](https://googleapis.github.io/genai-toolbox/resources/tools/alloydbainl/alloydb-ai-nl/)\n", "to generate and execute SQL queries on AlloyDB databases. We will start by\n", "setting up an AlloyDB database and then connecting to it using [MCP\n", "Toolbox](https://github.com/googleapis/genai-toolbox) and [ADK](https://google.github.io/adk-docs/). This allows us to leverage AlloyDB\n", "NL to SQL capabilities in agentic applications.\n", "\n", "For detailed information on the AlloyDB capability, please see [AlloyDB AI\n", "Natural Language Overview](https://cloud.google.com/alloydb/docs/ai/natural-language-overview).\n", "\n", "\n", "## Before you Begin\n", "\n", "Complete the before you begin setup from [Use AlloyDB AI natural language to generate SQL](https://cloud.google.com/alloydb/docs/ai/use-natural-language-generate-sql-queries#before-you-begin\n", ")." ] }, { "cell_type": "markdown", "metadata": { "id": "3SQj5_HZECRa" }, "source": [ "## Step 1: Set up Database\n", "\n", "Here, we will set up a sample AlloyDB database for testing out our `alloydb-ai-nl` tool.\n", "### Authenticate to Google Cloud within Colab\n", "You need to authenticate as an IAM user so this notebook can access your Google Cloud Project. This access is necessary to use Google's LLM models.\n" ] }, { "cell_type": "code", "execution_count": null, "metadata": { "id": "z1lNNl8B8tyP" }, "outputs": [], "source": [ "# Run this and allow access through the pop-up\n", "from google.colab import auth\n", "\n", "auth.authenticate_user()" ] }, { "cell_type": "markdown", "metadata": { "id": "yfg-u9Y4Mu_a" }, "source": [ "### Connect Your Google Cloud Project\n" ] }, { "cell_type": "code", "execution_count": null, "metadata": { "colab": { "base_uri": "https://localhost:8080/" }, "id": "u0Jc-0YNdhQd", "outputId": "be1275ea-dc5d-40e5-83c4-89320fb95d31" }, "outputs": [], "source": [ "# @markdown Please fill in the value below with your GCP project ID and then run the cell.\n", "\n", "# Please fill in these values.\n", "project_id = \"\" # @param {type:\"string\"}\n", "\n", "# Quick input validations.\n", "assert project_id, \"⚠️ Please provide a Google Cloud project ID\"\n", "\n", "# Configure gcloud.\n", "!gcloud config set project {project_id}" ] }, { "cell_type": "markdown", "metadata": { "id": "pWw3UBLFt9b5" }, "source": [ "### Set up AlloyDB\n" ] }, { "cell_type": "markdown", "metadata": { "id": "myJrfOKhhCI7" }, "source": [ "You will need a Postgres AlloyDB instance for the following stages of this notebook. Please set the following variables to connect to your instance or create a new instance.\n", "\n", "Please make sure that your instance has the required [access](https://cloud.google.com/alloydb/docs/ai/use-natural-language-generate-sql-queries#request-access).\n" ] }, { "cell_type": "code", "execution_count": null, "metadata": { "id": "dL6pXgbfuZRM" }, "outputs": [], "source": [ "# @markdown Please fill in the both the Google Cloud region and name of your AlloyDB instance. Once filled in, run the cell.\n", "\n", "# Please fill in these values.\n", "region = \"us-central1\" # @param {type:\"string\"}\n", "cluster_name = \"alloydb-ai-nl-testing\" # @param {type:\"string\"}\n", "instance_name = \"alloydb-ai-nl-testing-instance\" # @param {type:\"string\"}\n", "database_name = \"ai_nl_tool_testing\" # @param {type:\"string\"}\n", "password = input(\"Please provide a password to be used for 'postgres' database user: \")" ] }, { "cell_type": "code", "execution_count": null, "metadata": { "id": "BhxzDSRauol7" }, "outputs": [], "source": [ "# Quick input validations.\n", "assert region, \"⚠️ Please provide a Google Cloud region\"\n", "assert instance_name, \"⚠️ Please provide the name of your instance\"\n", "assert database_name, \"⚠️ Please provide the name of your database_name\"\n", "assert cluster_name, \"⚠️ Please provide the name of your cluster_name\"\n", "assert password, \"⚠️ Please provide your password\"" ] }, { "cell_type": "markdown", "metadata": { "id": "TAo6OqyhhqQ6" }, "source": [ "### Connect to AlloyDB\n" ] }, { "cell_type": "code", "execution_count": null, "metadata": { "colab": { "base_uri": "https://localhost:8080/" }, "id": "O4OhoLSRE8g0", "outputId": "c3aee24a-47ca-4fef-e388-426a3defbab2" }, "outputs": [], "source": [ "%pip install \\\n", " google-cloud-alloydb-connector[asyncpg]==1.4.0 \\\n", " sqlalchemy==2.0.36 \\\n", " vertexai==1.70.0 \\\n", " asyncio==3.4.3 \\\n", " greenlet==3.1.1 \\\n", " --quiet" ] }, { "cell_type": "markdown", "metadata": { "id": "3_-yI6CTS5ij" }, "source": [ "This function will create a connection pool to your AlloyDB instance using the [AlloyDB Python connector](https://github.com/GoogleCloudPlatform/alloydb-python-connector). The AlloyDB Python connector will automatically create secure connections to your AlloyDB instance using mTLS." ] }, { "cell_type": "code", "execution_count": null, "metadata": { "id": "Msm4IM70vHxB" }, "outputs": [], "source": [ "import asyncpg\n", "\n", "import sqlalchemy\n", "from sqlalchemy.ext.asyncio import AsyncEngine, create_async_engine\n", "\n", "from google.cloud.alloydb.connector import AsyncConnector, IPTypes\n", "\n", "async def init_connection_pool(connector: AsyncConnector, db_name: str, pool_size: int = 5) -> AsyncEngine:\n", " # initialize Connector object for connections to AlloyDB\n", " connection_string = f\"projects/{project_id}/locations/{region}/clusters/{cluster_name}/instances/{instance_name}\"\n", "\n", " async def getconn() -> asyncpg.Connection:\n", " conn: asyncpg.Connection = await connector.connect(\n", " connection_string,\n", " \"asyncpg\",\n", " user=\"postgres\",\n", " password=password,\n", " db=db_name,\n", " ip_type=IPTypes.PUBLIC,\n", " )\n", " return conn\n", "\n", " pool = create_async_engine(\n", " \"postgresql+asyncpg://\",\n", " async_creator=getconn,\n", " pool_size=pool_size,\n", " max_overflow=0,\n", " )\n", " return pool\n", "\n", "connector = AsyncConnector()" ] }, { "cell_type": "markdown", "metadata": { "id": "kOqRfZQjht3L" }, "source": [ "### Create a Database\n", "\n", "Next, you will create database to store the data using the connection pool. Enabling public IP takes a few minutes, you may get an error that there is no public IP address. Please wait and retry this step if you hit an error!" ] }, { "cell_type": "code", "execution_count": null, "metadata": { "colab": { "base_uri": "https://localhost:8080/" }, "id": "zypgoixYFI_9", "outputId": "72b062a2-9371-4541-8d3d-b02551536ae0" }, "outputs": [], "source": [ "from sqlalchemy import text, exc\n", "\n", "async def create_db(database_name, connector):\n", " pool = await init_connection_pool(connector, \"postgres\")\n", " async with pool.connect() as conn:\n", " # Check if the database already exists\n", " result = await conn.execute(text(f\"SELECT 1 FROM pg_database WHERE datname = '{database_name}'\"))\n", " if result.scalar_one_or_none() is None:\n", " await conn.execute(text(\"COMMIT\"))\n", " await conn.execute(text(f\"CREATE DATABASE {database_name}\"))\n", " print(f\"Database '{database_name}' created successfully\")\n", " else:\n", " print(f\"Database '{database_name}' already exists\")\n", " await pool.dispose()\n", "\n", "await create_db(database_name=database_name, connector=connector)" ] }, { "cell_type": "markdown", "metadata": { "id": "JbaXdRlsiwlD" }, "source": [ "### Set up your Database\n", "\n", "In the following steps, we set up our database to be ready to handle Natural\n", "Language Queries.\n", "1. **Create tables and populate data:** The provided schema and data are\n", "designed to support the fundamental operations of an online retail business,\n", "with potential applications extending to customer management, analytics,\n", "marketing, and operational aspects.\n", "\n", "1. **Configure Model Endpoint:** To use AlloyDB AI natural language, make sure that the [Vertex AI\n", "endpoint](https://cloud.google.com/alloydb/docs/ai/register-model-endpoint) is\n", "configured. Then you create a configuration and register a schema.\n", "`g_alloydb_ai_nl.g_create_configuration` creates the model.\n", "\n", "1. **Automated Context Generation:** To provide accurate answers to natural language questions, you use the AlloyDB AI natural language API to provide context about tables, views, and columns. You can use the automated context generation feature to produce context from tables and columns, and apply the context as ***COMMENTS*** attached to tables, views, and columns." ] }, { "cell_type": "code", "execution_count": null, "metadata": { "id": "eXl4egNtFknH" }, "outputs": [], "source": [ "setup_queries = [\n", " # Install required extension to use the AlloyDB AI natural language support API\n", " \"\"\"CREATE EXTENSION IF NOT EXISTS alloydb_ai_nl cascade;\"\"\",\n", " \"\"\"CREATE EXTENSION IF NOT EXISTS google_ml_integration; \"\"\",\n", " \n", " # Create schema\n", " \"\"\"CREATE SCHEMA IF NOT EXISTS nla_demo;\"\"\",\n", "\n", " # Create tables (If they do not exist)\n", " \"\"\"\n", " CREATE TABLE IF NOT EXISTS nla_demo.addresses (\n", " address_id SERIAL PRIMARY KEY,\n", " street_address VARCHAR(255) NOT NULL,\n", " city VARCHAR(255) NOT NULL,\n", " country VARCHAR(255)\n", " );\n", " \"\"\",\n", " \"\"\"\n", " CREATE TABLE IF NOT EXISTS nla_demo.customers (\n", " customer_id SERIAL PRIMARY KEY,\n", " first_name VARCHAR(255) NOT NULL,\n", " last_name VARCHAR(255) NOT NULL,\n", " email VARCHAR(255) UNIQUE NOT NULL,\n", " address_id INTEGER REFERENCES nla_demo.addresses(address_id),\n", " date_of_birth DATE,\n", " created_at TIMESTAMP DEFAULT CURRENT_TIMESTAMP\n", " );\n", " \"\"\",\n", " \"\"\"\n", " CREATE TABLE IF NOT EXISTS nla_demo.categories (\n", " category_id INTEGER PRIMARY KEY,\n", " category_name VARCHAR(255) UNIQUE NOT NULL\n", " );\n", " \"\"\",\n", " \"\"\"\n", " CREATE TABLE IF NOT EXISTS nla_demo.brands (\n", " brand_id INTEGER PRIMARY KEY,\n", " brand_name VARCHAR(255) NOT NULL\n", " );\n", " \"\"\",\n", " \"\"\"\n", " CREATE TABLE IF NOT EXISTS nla_demo.products (\n", " product_id INTEGER PRIMARY KEY,\n", " name VARCHAR(255) NOT NULL,\n", " description TEXT,\n", " brand_id INTEGER REFERENCES nla_demo.brands(brand_id),\n", " category_id INTEGER REFERENCES nla_demo.categories(category_id),\n", " created_at TIMESTAMP DEFAULT CURRENT_TIMESTAMP\n", " );\n", " \"\"\",\n", " \"\"\"\n", " CREATE TABLE IF NOT EXISTS nla_demo.orders (\n", " order_id INTEGER PRIMARY KEY,\n", " customer_id INTEGER REFERENCES nla_demo.customers(customer_id),\n", " order_date TIMESTAMP DEFAULT CURRENT_TIMESTAMP,\n", " total_amount DECIMAL(10, 2) NOT NULL,\n", " shipping_address_id INTEGER REFERENCES nla_demo.addresses(address_id),\n", " billing_address_id INTEGER REFERENCES nla_demo.addresses(address_id),\n", " order_status VARCHAR(50)\n", " );\n", " \"\"\",\n", " \"\"\"\n", " CREATE TABLE IF NOT EXISTS nla_demo.order_items (\n", " order_item_id SERIAL PRIMARY KEY,\n", " order_id INTEGER REFERENCES nla_demo.orders(order_id),\n", " product_id INTEGER REFERENCES nla_demo.products(product_id),\n", " quantity INTEGER NOT NULL,\n", " price DECIMAL(10, 2) NOT NULL\n", " );\n", " \"\"\",\n", "\n", " # Populate tables (Only if they are existing and empty)\n", " \"\"\"\n", " DO $$\n", " BEGIN\n", " IF NOT EXISTS (SELECT 1 FROM nla_demo.addresses) THEN\n", " INSERT INTO nla_demo.addresses (street_address, city, country)\n", " VALUES\n", " ('1800 Amphibious Blvd', 'Mountain View', 'USA'),\n", " ('Avenida da Pastelaria, 1903', 'Lisbon', 'Portugal'),\n", " ('8 Rue du Nom Fictif 341', 'Paris', 'France');\n", " END IF;\n", " END $$;\n", " \"\"\",\n", " \"\"\"\n", " DO $$\n", " BEGIN\n", " IF NOT EXISTS (SELECT 1 FROM nla_demo.customers) THEN\n", " INSERT INTO nla_demo.customers (first_name, last_name, email, address_id, date_of_birth)\n", " VALUES\n", " ('Alex', 'B.', 'alex.b@example.com', 1, '2003-02-20'),\n", " ('Amal', 'M.', 'amal.m@example.com', 2, '1998-11-08'),\n", " ('Dani', 'G.', 'dani.g@example.com', 3, '2002-07-25');\n", " END IF;\n", " END $$;\n", " \"\"\",\n", " \"\"\"\n", " DO $$\n", " BEGIN\n", " IF NOT EXISTS (SELECT 1 FROM nla_demo.categories) THEN\n", " INSERT INTO nla_demo.categories (category_id, category_name)\n", " VALUES\n", " (1, 'Accessories'),\n", " (2, 'Apparel'),\n", " (3, 'Footwear'),\n", " (4, 'Swimwear');\n", " END IF;\n", " END $$;\n", " \"\"\",\n", " \"\"\"\n", " DO $$\n", " BEGIN\n", " IF NOT EXISTS (SELECT 1 FROM nla_demo.brands) THEN\n", " INSERT INTO nla_demo.brands (brand_id, brand_name)\n", " VALUES\n", " (1, 'CymbalPrime'),\n", " (2, 'CymbalPro'),\n", " (3, 'CymbalSports');\n", " END IF;\n", " END $$;\n", " \"\"\",\n", " \"\"\"\n", " DO $$\n", " BEGIN\n", " IF NOT EXISTS (SELECT 1 FROM nla_demo.products) THEN\n", " INSERT INTO nla_demo.products (product_id, brand_id, category_id, name)\n", " VALUES\n", " (1, 1, 2, 'Hoodie'),\n", " (2, 1, 3, 'Running Shoes'),\n", " (3, 2, 4, 'Swimsuit'),\n", " (4, 3, 1, 'Tote Bag'),\n", " (5, 3, 3, 'CymbalShoe');\n", " END IF;\n", " END $$;\n", " \"\"\",\n", " \"\"\"\n", " DO $$\n", " BEGIN\n", " IF NOT EXISTS (SELECT 1 FROM nla_demo.orders) THEN\n", " INSERT INTO nla_demo.orders (order_id, customer_id, total_amount, shipping_address_id, billing_address_id, order_status)\n", " VALUES\n", " (1, 1, 99.99, 1, 1, 'Shipped'),\n", " (2, 1, 69.99, 1, 1, 'Delivered'),\n", " (3, 2, 20.99, 2, 2, 'Processing'),\n", " (4, 3, 79.99, 3, 3, 'Shipped');\n", " END IF;\n", " END $$;\n", " \"\"\",\n", " \"\"\"\n", " DO $$\n", " BEGIN\n", " IF NOT EXISTS (SELECT 1 FROM nla_demo.order_items) THEN\n", " INSERT INTO nla_demo.order_items (order_id, product_id, quantity, price)\n", " VALUES\n", " (1, 1, 1, 79.99),\n", " (1, 3, 1, 20.00),\n", " (2, 4, 1, 69.99),\n", " (3, 3, 1, 20.00),\n", " (4, 2, 1, 79.99);\n", " END IF;\n", " END $$;\n", " \"\"\",\n", "\n", " # Create a natural language configuration \n", " # alloydb_ai_nl.g_create_configuration creates the model.\n", " \"\"\"\n", " DO $$\n", " BEGIN\n", " IF NOT EXISTS (SELECT 1 from alloydb_ai_nl.g_magic_configuration where configuration_id = 'nla_demo_cfg') THEN\n", " PERFORM alloydb_ai_nl.g_create_configuration( 'nla_demo_cfg' );\n", " END IF;\n", " END $$;\n", " \"\"\",\n", " \"\"\"\n", " SELECT alloydb_ai_nl.g_manage_configuration(\n", " operation => 'register_table_view',\n", " configuration_id_in => 'nla_demo_cfg',\n", " table_views_in=>'{nla_demo.customers, nla_demo.addresses, nla_demo.brands, nla_demo.products, nla_demo.categories, nla_demo.orders, nla_demo.order_items}'\n", " );\n", " \"\"\",\n", "\n", " # Create and apply context\n", " \"\"\"\n", " SELECT alloydb_ai_nl.generate_schema_context(\n", " 'nla_demo_cfg',\n", " TRUE\n", " );\n", " \"\"\",\n", "\n", " # Enable parametrized views to get AlloyDB query responses\n", " \"\"\"\n", " CREATE EXTENSION IF NOT EXISTS parameterized_views;\n", " \"\"\"\n", "]" ] }, { "cell_type": "code", "execution_count": null, "metadata": { "id": "K0ksQv6wFroJ" }, "outputs": [], "source": [ "from google.cloud.alloydb.connector import AsyncConnector\n", "\n", "# Create table and insert data\n", "async def run_setup(pool):\n", " async with pool.connect() as db_conn:\n", " for query in setup_queries:\n", " await db_conn.execute(sqlalchemy.text(query))\n", " await db_conn.commit()" ] }, { "cell_type": "code", "execution_count": null, "metadata": { "id": "70aEX9F_Mxll" }, "outputs": [], "source": [ "pool = await init_connection_pool(connector, database_name)\n", "await run_setup(pool)" ] }, { "cell_type": "markdown", "metadata": { "id": "6jggEDlKjJ2O" }, "source": [ "### Set up NL to SQL capability of AlloyDB\n", "\n", "Now, we will be doing some setup to ensure the working of the NL to SQL\n", "capability of AlloyBD\n", "\n", "1. **Verify and updated generated context:** Verify the generated context for\n", " the tables and update any that needs revision.\n", "1. **Construct the value index:** The AlloyDB AI natural language API produces accurate SQL queries by using value linking. Value linking associates value phrases in natural language statements with pre-registered concept types and column names which can enrich the natural language question.\n", "1. **Define a query template:** You can define templates to improve the quality of the answers produced by the AlloyDB AI natural language API." ] }, { "cell_type": "code", "execution_count": null, "metadata": { "id": "rE53Z3U0YaR5" }, "outputs": [], "source": [ "verify_context_queries = [\n", " \"\"\"\n", " SELECT object_context\n", " FROM alloydb_ai_nl.generated_schema_context_view\n", " WHERE schema_object = 'nla_demo.products';\n", " \"\"\",\n", " \"\"\"\n", " SELECT object_context\n", " FROM alloydb_ai_nl.generated_schema_context_view\n", " WHERE schema_object = 'nla_demo.products.name';\n", " \"\"\"\n", "]\n", "\n", "\n", "update_context_queries = [\n", " \"\"\"\n", " SELECT alloydb_ai_nl.update_generated_relation_context(\n", " 'nla_demo.products',\n", " 'The \"nla_demo.products\" table stores product details such as ID, name, description, brand, category linkage, and record creation time.'\n", " );\n", " \"\"\",\n", " \"\"\"\n", " SELECT alloydb_ai_nl.update_generated_column_context(\n", " 'nla_demo.products.name',\n", " 'The \"name\" column in the \"nla_demo.products\" table contains the specific name or title of each product.'\n", " );\n", " \"\"\"\n", "]\n", "\n", "apply_generated_context_queries = [\n", " \"\"\"\n", " SELECT alloydb_ai_nl.apply_generated_relation_context(\n", " 'nla_demo.products', true\n", " );\n", " \"\"\",\n", " \"\"\"\n", " SELECT alloydb_ai_nl.apply_generated_column_context(\n", " 'nla_demo.products.name',\n", " true\n", " );\n", " \"\"\"\n", "]\n", "\n", "define_product_name_context_queries = [\n", " \"\"\"\n", " DO $$\n", " BEGIN\n", " IF NOT EXISTS (select 1 from alloydb_ai_nl.concept_types_user_defined where type_name = 'product_name') THEN\n", " PERFORM alloydb_ai_nl.add_concept_type(\n", " concept_type_in => 'product_name',\n", " match_function_in => 'alloydb_ai_nl.get_concept_and_value_generic_entity_name',\n", " additional_info_in => '{\n", " \"description\": \"Concept type for product name.\",\n", " \"examples\": \"SELECT alloydb_ai_nl.get_concept_and_value_generic_entity_name(''Camera'')\" }'::jsonb\n", " );\n", " END IF;\n", " END $$;\n", " \"\"\",\n", " \"\"\"\n", " SELECT alloydb_ai_nl.associate_concept_type(\n", " 'nla_demo.products.name',\n", " 'product_name',\n", " 'nla_demo_cfg'\n", " );\n", " \"\"\",\n", "]\n", "\n", "verify_product_name_concept_query = \"SELECT alloydb_ai_nl.list_concept_types();\"\n", "\n", "verify_product_name_association_query = \"\"\"\n", " SELECT *\n", " FROM alloydb_ai_nl.value_index_columns\n", " WHERE column_names = 'nla_demo.products.name';\n", "\"\"\"\n", "\n", "define_brand_name_concept_queries = [\n", " \"\"\"\n", " DO $$\n", " BEGIN\n", " IF NOT EXISTS (select 1 from alloydb_ai_nl.concept_types_user_defined where type_name = 'brand_name') THEN\n", " PERFORM alloydb_ai_nl.add_concept_type(\n", " concept_type_in => 'brand_name',\n", " match_function_in => 'alloydb_ai_nl.get_concept_and_value_generic_entity_name',\n", " additional_info_in => '{\n", " \"description\": \"Concept type for brand name.\",\n", " \"examples\": \"SELECT alloydb_ai_nl.get_concept_and_value_generic_entity_name(''CymbalPrime'')\" }'::jsonb\n", " );\n", " END IF;\n", " END $$;\n", " \"\"\",\n", " \"\"\"\n", " SELECT alloydb_ai_nl.associate_concept_type(\n", " 'nla_demo.brands.brand_name',\n", " 'brand_name',\n", " 'nla_demo_cfg'\n", " );\n", " \"\"\",\n", "]\n", "\n", "construct_value_index_queries = [\n", " \"\"\"SELECT alloydb_ai_nl.create_value_index('nla_demo_cfg');\"\"\",\n", " \"\"\"SELECT alloydb_ai_nl.refresh_value_index('nla_demo_cfg');\"\"\"\n", "]\n", "\n", "query_template_sql = \"\"\"\n", " SELECT c.first_name, c.last_name FROM nla_demo.Customers c JOIN nla_demo.orders o ON c.customer_id = o.customer_id JOIN nla_demo.order_items oi ON o.order_id = oi.order_id JOIN nla_demo.products p ON oi.product_id = p.product_id AND p.name = ''Swimsuit''\n", "\"\"\"\n", "\n", "define_query_template_query = f\"\"\"\n", " DO $$\n", " BEGIN\n", " IF NOT EXISTS (select 1 from alloydb_ai_nl.g_template_store where template_sql = '{query_template_sql}') THEN\n", " PERFORM alloydb_ai_nl.add_template(\n", " nl_config_id => 'nla_demo_cfg',\n", " intent => 'List the first names and the last names of all customers who ordered Swimsuit.',\n", " sql => '{query_template_sql}',\n", " sql_explanation => 'To answer this question, JOIN `nla_demo.Customers` with `nla_demo.orders` on having the same `customer_id`, and JOIN the result with nla_demo.order_items on having the same `order_id`. Then JOIN the result with `nla_demo.products` on having the same `product_id`, and filter rwos that with p.name = ''Swimsuit''. Return the `first_name` and the `last_name` of the customers with matching records.',\n", " check_intent => TRUE\n", " );\n", " END IF;\n", " END $$;\n", "\n", "\"\"\"\n", "\n", "view_added_template_query = \"\"\"\n", " SELECT nl, sql, intent, psql, pintent\n", " FROM alloydb_ai_nl.template_store_view\n", " WHERE config = 'nla_demo_cfg';\n", "\"\"\"" ] }, { "cell_type": "code", "execution_count": null, "metadata": { "colab": { "base_uri": "https://localhost:8080/" }, "id": "R_wE20rxbBUu", "outputId": "0f988ca2-aa54-4db1-f95c-009ac3b32ee2" }, "outputs": [], "source": [ "async def run_queries(pool):\n", " async with pool.connect() as db_conn:\n", "\n", " # Verify the generated context for the tables\n", " for query in verify_context_queries:\n", " response = await db_conn.execute(sqlalchemy.text(query))\n", " print(\"Verify the context:\", response.mappings().all())\n", "\n", " \n", " # Update context that needs revision\n", " for query in update_context_queries:\n", " await db_conn.execute(sqlalchemy.text(query))\n", "\n", " \n", " # The resulting context entries in the alloydb_ai_nl.generated_schema_context_view \n", " # view are applied to the corresponding schema objects, and the comments are overwritten.\n", " for query in apply_generated_context_queries:\n", " await db_conn.execute(sqlalchemy.text(query))\n", "\n", " # Define the product_name concept type and associate it with the nla_demo.products.name column\n", " for query in define_product_name_context_queries:\n", " await db_conn.execute(sqlalchemy.text(query))\n", "\n", " # Verify that the product_name concept type is added to the list of concept types\n", " response = await db_conn.execute(sqlalchemy.text(verify_product_name_concept_query))\n", " print(\"Verify the product name concept:\", response.mappings().all())\n", "\n", " # Verify that the nla_demo.products.name column is associated with the product_name concept type\n", " response = await db_conn.execute(sqlalchemy.text(verify_product_name_association_query))\n", " print(\"Verify the product name association:\", response.mappings().all())\n", "\n", " # Define the brand_name concept type and associate it with the nla_demo.brands.brand_name column\n", " for query in define_brand_name_concept_queries:\n", " await db_conn.execute(sqlalchemy.text(query))\n", "\n", " # Create a value index\n", " for query in construct_value_index_queries:\n", " await db_conn.execute(sqlalchemy.text(query))\n", "\n", " # Add a template (This helps in improving accuracy for critical questions)\n", " await db_conn.execute(sqlalchemy.text(define_query_template_query))\n", "\n", " # View a list of added templates\n", " response = await db_conn.execute(sqlalchemy.text(view_added_template_query))\n", " print(\"View added template:\", response.mappings().all())\n", "\n", " await db_conn.commit()\n", "\n", "# pool = await init_connection_pool(connector, database_name)\n", "await run_queries(pool)\n", "await pool.dispose()" ] }, { "cell_type": "markdown", "metadata": { "id": "Bl1IeaqZbMYh" }, "source": [ "## Step 2: Set up Toolbox\n", "\n", "Here, we will set up the Toolbox server to interact with our AlloyDB Database." ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "Download the [latest](https://github.com/googleapis/genai-toolbox/releases) version of Toolbox as a binary." ] }, { "cell_type": "code", "execution_count": null, "metadata": { "colab": { "base_uri": "https://localhost:8080/" }, "id": "lbsQ1Aa-IszB", "outputId": "35aa4018-6e37-48fa-e8a2-79e2e1a85695" }, "outputs": [], "source": [ "version = \"0.17.0\" # x-release-please-version\n", "! curl -L -o /content/toolbox https://storage.googleapis.com/genai-toolbox/v{version}/linux/amd64/toolbox\n", "\n", "# Make the binary executable\n", "! chmod +x /content/toolbox" ] }, { "cell_type": "code", "execution_count": null, "metadata": { "id": "Ovlzi2RVJGM5" }, "outputs": [], "source": [ "TOOLBOX_BINARY_PATH = \"/content/toolbox\"\n", "SERVER_PORT = 5000" ] }, { "cell_type": "code", "execution_count": null, "metadata": { "id": "Jje8N5fScchw" }, "outputs": [], "source": [ "# Create a tools file\n", "tools_file_name = \"tools.yml\"\n", "file_content = f\"\"\"\n", "sources:\n", " my-alloydb-pg-source:\n", " kind: alloydb-postgres\n", " project: {project_id}\n", " region: {region}\n", " cluster: {cluster_name}\n", " instance: {instance_name}\n", " database: {database_name}\n", " user: postgres\n", " password: {password}\n", "tools:\n", " ask_questions:\n", " kind: alloydb-ai-nl\n", " source: my-alloydb-pg-source\n", " description: 'Ask any natural language questions about the tables'\n", " nlConfig: 'nla_demo_cfg'\n", " basic_sql:\n", " kind: postgres-sql\n", " source: my-alloydb-pg-source\n", " description: 'Check if db is connected'\n", " statement: SELECT * from nla_demo.products;\n", "\"\"\"" ] }, { "cell_type": "code", "execution_count": null, "metadata": { "id": "JPNXr4y58tMH" }, "outputs": [], "source": [ "# Write the file content into the tools file.\n", "! echo \"{file_content}\" > \"{tools_file_name}\"" ] }, { "cell_type": "code", "execution_count": null, "metadata": { "id": "5ZH5VuYzdP_W" }, "outputs": [], "source": [ "TOOLS_FILE_PATH = f\"/content/{tools_file_name}\"" ] }, { "cell_type": "code", "execution_count": null, "metadata": { "id": "iZGQzYUF-pho" }, "outputs": [], "source": [ "# Start a toolbox server\n", "! nohup {TOOLBOX_BINARY_PATH} --tools-file {TOOLS_FILE_PATH} --log-level debug --logging-format json -p {SERVER_PORT} > toolbox.log 2>&1 &" ] }, { "cell_type": "code", "execution_count": null, "metadata": { "colab": { "base_uri": "https://localhost:8080/" }, "id": "1PJpKOBieKOV", "outputId": "7bae4851-a54b-4533-e714-35e341a1f4d7" }, "outputs": [], "source": [ "# Check if toolbox is running\n", "!sudo lsof -i :{SERVER_PORT}" ] }, { "cell_type": "code", "execution_count": null, "metadata": { "id": "ReQHko3TxOFg" }, "outputs": [], "source": [ "# Kill process at port\n", "# !lsof -t -i :{SERVER_PORT} | xargs kill -9" ] }, { "cell_type": "markdown", "metadata": { "id": "QqRlWqvYNKSo" }, "source": [ "## Step 3: Connect Using ADK\n", "\n", "Now, we will use ADK to connect to the server and answer natural language\n", "questions related to our database." ] }, { "cell_type": "code", "execution_count": null, "metadata": { "colab": { "base_uri": "https://localhost:8080/" }, "id": "dhQTKlpVNKSo", "outputId": "7f836e74-d82a-4921-f35e-74651581ef47" }, "outputs": [], "source": [ "! pip install toolbox-core --quiet\n", "! pip install google-adk --quiet" ] }, { "cell_type": "code", "execution_count": null, "metadata": { "colab": { "base_uri": "https://localhost:8080/" }, "id": "tSLO_0vKNKSo", "outputId": "4cbfd5ad-63d4-45b9-ae24-27b76d4836ef" }, "outputs": [], "source": [ "from google.adk.agents import Agent\n", "from google.adk.runners import Runner\n", "from google.adk.sessions import InMemorySessionService\n", "from google.adk.artifacts.in_memory_artifact_service import InMemoryArtifactService\n", "from google.genai import types\n", "from toolbox_core import ToolboxSyncClient\n", "\n", "import os\n", "\n", "# TODO(developer): replace this with your Google API key\n", "os.environ['GOOGLE_API_KEY'] = \"your-api-key\"\n", "\n", "toolbox_client = ToolboxSyncClient(f\"http://127.0.0.1:{SERVER_PORT}\")\n", "\n", "prompt = \"\"\"You're a helpful assistant to fetch queries for an e-commerce application.\"\"\"\n", "\n", "tools = toolbox_client.load_toolset()\n", "\n", "root_agent = Agent(\n", " model='gemini-2.0-flash',\n", " name='alloy_nl_agent',\n", " description='A helpful AI assistant.',\n", " instruction=prompt,\n", " tools=tools,\n", ")\n", "\n", "session_service = InMemorySessionService()\n", "artifacts_service = InMemoryArtifactService()\n", "session = await session_service.create_session(\n", " state={}, app_name='alloy_nl_agent', user_id='123'\n", ")\n", "runner = Runner(\n", " app_name='alloy_nl_agent',\n", " agent=root_agent,\n", " artifact_service=artifacts_service,\n", " session_service=session_service,\n", ")\n", "\n", "queries = [\n", " \"Find the customers who purchased Tote Bag.\",\n", " \"List the maximum price of any CymbalShoe.\", # This will return null. This data is not in the dataset.\n", " \"List the maximum price of any CymbalPrime.\",\n", " \"Find the last name of the customers who live in France.\"\n", "]\n", "\n", "for query in queries:\n", " content = types.Content(role='user', parts=[types.Part(text=query)])\n", " events = runner.run(session_id=session.id,\n", " user_id='123', new_message=content)\n", "\n", " responses = (\n", " part.text\n", " for event in events\n", " for part in event.content.parts\n", " if part.text is not None\n", " )\n", "\n", " for text in responses:\n", " print(text)" ] } ], "metadata": { "colab": { "provenance": [] }, "kernelspec": { "display_name": "Python 3", "name": "python3" }, "language_info": { "name": "python" } }, "nbformat": 4, "nbformat_minor": 0 }

MCP directory API

We provide all the information about MCP servers via our MCP API.

curl -X GET 'https://glama.ai/api/mcp/v1/servers/googleapis/genai-toolbox'

If you have feedback or need assistance with the MCP directory API, please join our Discord server