EPA Air Quality System (AQS) MCP Server

aqs_monitors_by_cbsa

Retrieve air quality monitoring locations for specific pollutants within metropolitan or micropolitan statistical areas using EPA data.

Instructions

Get all air quality monitors in a Core Based Statistical Area (CBSA). CBSAs are metropolitan or micropolitan statistical areas defined by the Office of Management and Budget.

Parameters:

  • param: 5-digit AQS parameter code for the pollutant. Common codes:

    • 44201: Ozone (O3)

    • 88101: PM2.5 (Fine Particulate Matter, Local Conditions)

    • 81102: PM10 (Particulate Matter)

    • 42401: Sulfur Dioxide (SO2)

    • 42101: Carbon Monoxide (CO)

    • 42602: Nitrogen Dioxide (NO2)

  • bdate/edate: Begin and end dates in YYYYMMDD format (must be same calendar year)

  • cbsa: 5-digit CBSA code. Examples:

    • 31080: Los Angeles-Long Beach-Anaheim, CA

    • 35620: New York-Newark-Jersey City, NY-NJ-PA

    • 16980: Chicago-Naperville-Elgin, IL-IN-WI

    • 19100: Dallas-Fort Worth-Arlington, TX

    • 26420: Houston-The Woodlands-Sugar Land, TX

    • 38060: Phoenix-Mesa-Chandler, AZ

Note: Email and API key can be provided explicitly; if omitted, the AQS_EMAIL and AQS_API_KEY environment variables are used.
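
For orientation, the snippet below is a minimal Python sketch of the kind of request this tool makes against the EPA AQS Data Mart API (the monitors/byCBSA endpoint), including the environment-variable fallback described above. The function name, error handling, and timeout are illustrative assumptions, not part of the server.

```python
import os

import requests

def monitors_by_cbsa(param, bdate, edate, cbsa, email=None, key=None):
    """Fetch monitors for a CBSA; hypothetical stand-in for the MCP tool."""
    # Fall back to environment variables when credentials are not passed in,
    # mirroring the behavior noted above.
    email = email or os.environ["AQS_EMAIL"]
    key = key or os.environ["AQS_API_KEY"]
    resp = requests.get(
        "https://aqs.epa.gov/data/api/monitors/byCBSA",
        params={
            "email": email,
            "key": key,
            "param": param,  # 5-digit AQS parameter code, e.g. "44201" for ozone
            "bdate": bdate,  # YYYYMMDD; must be the same calendar year as edate
            "edate": edate,
            "cbsa": cbsa,    # 5-digit CBSA code, e.g. "31080" for Los Angeles
        },
        timeout=30,
    )
    resp.raise_for_status()
    return resp.json()

# Example: ozone monitors in the Los Angeles metro area for January 2023.
monitors = monitors_by_cbsa("44201", "20230101", "20230131", "31080")
```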

Input Schema

| Name  | Required | Description                                                                  |
|-------|----------|------------------------------------------------------------------------------|
| email | No       | Email address for API authentication (optional if AQS_EMAIL env var is set) |
| key   | No       | API key for authentication (optional if AQS_API_KEY env var is set)         |
| param | Yes      | 5-digit AQS parameter code (e.g., 44201 for Ozone)                          |
| bdate | Yes      | Begin date in YYYYMMDD format                                                |
| edate | Yes      | End date in YYYYMMDD format (must be same calendar year as bdate)           |
| cbsa  | Yes      | 5-digit CBSA code (e.g., 31080 for Los Angeles metro area)                  |
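
As a concrete illustration, a hypothetical call that lists PM2.5 monitors in the New York metro area for June 2023 could pass arguments like the following (email and key omitted so the environment-variable fallback applies):

```json
{
  "param": "88101",
  "bdate": "20230601",
  "edate": "20230630",
  "cbsa": "35620"
}
```
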
Behavior: 4/5

Does the description disclose side effects, auth requirements, rate limits, or destructive behavior?

With no annotations provided, the description carries the full burden and does well: it discloses authentication behavior (environment-variable fallback) and the date constraint (same-calendar-year requirement), and it provides context about CBSA definitions. However, it doesn't mention rate limits, pagination, or error behavior, all of which would be helpful for a data retrieval tool.

Agents need to know what a tool does to the world before calling it. Descriptions should go beyond structured annotations to explain consequences.

Conciseness: 4/5

Is the description appropriately sized, front-loaded, and free of redundancy?

Well structured, with a purpose statement, parameter explanations with examples, and an authentication note. Every sentence adds value, though the CBSA definition could be slightly more concise. The information is front-loaded, with the core purpose stated first.

Shorter descriptions cost fewer tokens and are easier for agents to parse. Every sentence should earn its place.

Completeness: 3/5

Given the tool's complexity, does the description cover enough for an agent to succeed on first attempt?

For a data retrieval tool with six parameters and 100% schema coverage but no output schema or annotations, the description provides good parameter context and authentication details. However, it lacks information about the return format, data volume, and error handling that would help an agent use the tool effectively.

Complex tools with many parameters or behaviors need more documentation. Simple tools need less. This dimension scales expectations accordingly.

Parameters: 5/5

Does the description clarify parameter syntax, constraints, interactions, or defaults beyond what the schema provides?

The description adds substantial value beyond the 100% schema coverage by providing specific examples of parameter codes (e.g., 44201 for Ozone) and CBSA codes with real-world geographic mappings (e.g., 31080 for Los Angeles). This transforms abstract parameter definitions into practical, actionable information for the agent.

Input schemas describe structure but not intent. Descriptions should explain non-obvious parameter relationships and valid value ranges.

Purpose: 5/5

Does the description clearly state what the tool does and how it differs from similar tools?

The description clearly states the verb ('Get') and resource ('air quality monitors') with a specific geographic scope ('in a Core Based Statistical Area (CBSA)'), and explains what CBSAs are. It distinguishes itself from siblings such as aqs_monitors_by_box and aqs_monitors_by_county by specifying the CBSA-based filtering approach.

Agents choose between tools based on descriptions. A clear purpose with a specific verb and resource helps agents select the right tool.

Usage Guidelines: 3/5

Does the description explain when to use this tool, when not to, or what alternatives exist?

The description implies usage context through the CBSA explanation and parameter examples, but it doesn't explicitly state when to use this tool rather than alternatives such as aqs_monitors_by_county or aqs_monitors_by_state. It provides parameter guidance but no comparative recommendations against sibling tools.

Agents often have multiple tools that could apply. Explicit usage guidance like "use X instead of Y when Z" prevents misuse.
