Census ACS Demographics

Server Details

Population, income, poverty, education, housing, and commuting from the US Census ACS

Status: Healthy
Transport: Streamable HTTP
Glama MCP Gateway

Connect through the Glama MCP Gateway for full control over tool access and complete visibility into every call.

Full call logging: Every tool call is logged with complete inputs and outputs, so you can debug issues and audit what your agents are doing.

Tool access control: Enable or disable individual tools per connector, so you decide what your agents can and cannot do.

Managed credentials: Glama handles OAuth flows, token storage, and automatic rotation, so credentials never expire on your clients.

Usage analytics: See which tools your agents call, how often, and when, so you can understand usage patterns and catch anomalies.

100% free. Your data is private.

Available Tools

5 tools
get_commuting_data

Get means of transportation to work data for counties.

Returns worker counts and percentages for: drove alone, carpooled,
public transit, walked, bicycle, taxi/motorcycle/other, and worked from home.

Args:
    state: Two-letter state abbreviation (e.g. 'WA', 'CA') or 2-digit FIPS code.
    county_fips: Three-digit county FIPS code (e.g. '033' for King County).
                 Omit to get all counties in the state.
    year: ACS 5-year estimate year (default 2022).
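As a sketch, the arguments described above might be assembled like this (the argument names come from the Args section; everything else about issuing the call depends on your MCP client):

```python
# Hypothetical argument payloads for get_commuting_data.
# "state" accepts a two-letter abbreviation or a 2-digit FIPS code;
# "county_fips" is optional, and omitting it returns every county.
args_single_county = {
    "state": "WA",
    "county_fips": "033",  # King County
    "year": 2022,
}

# Statewide query: omit county_fips to get all counties in the state.
args_statewide = {"state": "WA", "year": 2022}

print(sorted(args_single_county))  # → ['county_fips', 'state', 'year']
```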
Parameters (JSON Schema)

    Name         Required   Description   Default
    year         No
    state        Yes
    county_fips  No

Output Schema (JSON Schema)

    Name     Required   Description
    result   Yes
Behavior: 4/5

Does the description disclose side effects, auth requirements, rate limits, or destructive behavior?

Discloses the data source (ACS 5-year estimates), the default year (2022), and the batch-retrieval behavior when county_fips is omitted; no structured annotations exist to contradict the description.

Agents need to know what a tool does to the world before calling it. Descriptions should go beyond structured annotations to explain consequences.

Conciseness: 4/5

Is the description appropriately sized, front-loaded, and free of redundancy?

Well-structured with clear separation between purpose statement and Args section; every sentence provides specific value without redundancy.

Shorter descriptions cost fewer tokens and are easier for agents to parse. Every sentence should earn its place.

Completeness: 4/5

Given the tool's complexity, does the description cover enough for an agent to succeed on first attempt?

Appropriately complete given the tool's narrow scope and existing output schema; covers the ACS data source and all parameter behaviors sufficiently.

Complex tools with many parameters or behaviors need more documentation. Simple tools need less. This dimension scales expectations accordingly.

Parameters: 5/5

Does the description clarify parameter syntax, constraints, interactions, or defaults beyond what the schema provides?

Essential compensation for 0% schema description coverage by providing detailed parameter semantics, examples ('WA', '033'), and default behaviors for all three arguments.

Input schemas describe structure but not intent. Descriptions should explain non-obvious parameter relationships and valid value ranges.

Purpose: 4/5

Does the description clearly state what the tool does and how it differs from similar tools?

Clearly states it retrieves 'means of transportation to work data' with specific categories listed; implicitly distinguishes from demographic/economic siblings by focusing on commuting modalities.

Agents choose between tools based on descriptions. A clear purpose with a specific verb and resource helps agents select the right tool.

Usage Guidelines: 3/5

Does the description explain when to use this tool, when not to, or what alternatives exist?

Provides operational guidance on omitting county_fips to retrieve statewide data, but lacks explicit guidance on when to choose this tool over sibling tools like get_county_demographics.

Agents often have multiple tools that could apply. Explicit usage guidance like "use X instead of Y when Z" prevents misuse.

get_county_demographics

Get demographic data for counties: population, median age, race, Hispanic origin, income, and poverty.

Returns one record per county with total population, median age, racial breakdown
(White, Black, American Indian, Asian, Pacific Islander, Other, Two+),
Hispanic/Latino percentage, median household income, and poverty rate.

Args:
    state: Two-letter state abbreviation (e.g. 'WA', 'CA') or 2-digit FIPS code.
    county_fips: Three-digit county FIPS code (e.g. '033' for King County).
                 Omit to get all counties in the state.
    year: ACS 5-year estimate year (default 2022). Data covers year-4 through year.
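Under the hood, a tool like this presumably queries the public Census Bureau ACS 5-year API. As a sketch (the URL pattern follows the documented Census API format; the helper name and the choice of variable are illustrative, not the server's actual code):

```python
from urllib.parse import urlencode

def build_acs5_url(year: int, variables: list[str],
                   state_fips: str, county_fips: str = "*") -> str:
    """Build a Census ACS 5-year estimates query URL.

    county_fips="*" requests every county in the state, mirroring
    the tool's behavior when county_fips is omitted.
    """
    base = f"https://api.census.gov/data/{year}/acs/acs5"
    params = {
        "get": ",".join(["NAME"] + variables),  # NAME adds the geography name
        "for": f"county:{county_fips}",
        "in": f"state:{state_fips}",
    }
    # Keep "," and ":" literal, as the Census API examples show them.
    return f"{base}?{urlencode(params, safe=',:*')}"

# B19013_001E is median household income; "53" is Washington's state FIPS code.
url = build_acs5_url(2022, ["B19013_001E"], "53", "033")
print(url)
# → https://api.census.gov/data/2022/acs/acs5?get=NAME,B19013_001E&for=county:033&in=state:53
```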
Parameters (JSON Schema)

    Name         Required   Description   Default
    year         No
    state        Yes
    county_fips  No

Output Schema (JSON Schema)

    Name     Required   Description
    result   Yes
Behavior: 4/5

Discloses the ACS 5-year data source, the temporal coverage ('year-4 through year'), and the return structure ('one record per county') beyond what annotations provide (none exist).

Conciseness: 5/5

Well-structured with a front-loaded purpose, a clear Args section, and no redundant text; every sentence provides information not found in the structured fields.

Completeness: 4/5

Comprehensive coverage of inputs and outputs given the simple parameter structure; appropriately handles the lack of schema descriptions and the presence of an output schema.

Parameters: 5/5

The schema has 0% description coverage; the description fully compensates with detailed semantics, examples (e.g. 'WA', '033'), and behavior notes for all three parameters.

Purpose: 5/5

Uses a clear, specific verb ('Get') and resource ('demographic data for counties'), and explicitly lists demographic fields to distinguish it from its siblings (economics, education, commuting).

Usage Guidelines: 3/5

Implies differentiation through the return-value descriptions and notes that county_fips can be omitted for bulk retrieval, but lacks explicit 'when to use vs. alternatives' guidance.

get_county_economics

Get economic data for counties: income, poverty, home values, rent, and health insurance.

Returns median household income, poverty rate, median home value, median gross rent,
and health insurance coverage rates (insured vs uninsured).

Args:
    state: Two-letter state abbreviation (e.g. 'WA', 'CA') or 2-digit FIPS code.
    county_fips: Three-digit county FIPS code (e.g. '033' for King County).
                 Omit to get all counties in the state.
    year: ACS 5-year estimate year (default 2022).
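The state parameter accepts either a two-letter abbreviation or a 2-digit FIPS code. A minimal sketch of how such input could be normalized (the helper and the truncated lookup table are illustrative, not the server's code; the two sample FIPS codes are real: WA = 53, CA = 06):

```python
# Partial lookup table; a full map would cover all states and territories.
STATE_FIPS = {"WA": "53", "CA": "06"}

def normalize_state(state: str) -> str:
    """Return a 2-digit state FIPS code from either an abbreviation
    or a FIPS code (zero-padded if given as a bare digit)."""
    if state.isdigit():
        return state.zfill(2)
    return STATE_FIPS[state.upper()]

print(normalize_state("wa"), normalize_state("6"))  # → 53 06
```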
Parameters (JSON Schema)

    Name         Required   Description   Default
    year         No
    state        Yes
    county_fips  No

Output Schema (JSON Schema)

    Name     Required   Description
    result   Yes
Behavior: 4/5

No annotations to contradict; the description carries the burden well by explaining the batch behavior (omit county_fips to get all counties) and identifying the data source as ACS 5-year estimates.

Conciseness: 5/5

Well-structured with a clear hierarchy (summary → returns → args), a front-loaded purpose, and no redundancy; listing return values is acceptable given that an output schema exists.

Completeness: 4/5

Comprehensive for a 3-parameter tool; covers the required parameter's constraints and the optional parameters' defaults, though it could briefly note the Census Bureau as the ACS data source.

Parameters: 5/5

The schema has 0% description coverage; the description fully compensates with formats (two-letter abbreviation or FIPS code), examples ('033' for King County), and default behaviors for all three parameters.

Purpose: 5/5

A specific verb and resource ('Get economic data') plus a list of distinct data types (income, poverty, home values) clearly differentiate it from the sibling tools covering demographics, education, and commuting.

Usage Guidelines: 3/5

Implies usage through the specificity of its economic metrics but lacks explicit guidance on when to choose it over get_county_demographics or get_county_education.

get_county_education

Get educational attainment for counties (population 25+).

Returns counts and percentages for: less than high school, high school diploma/GED,
some college/associate degree, bachelor's degree, and graduate/professional degree.

Args:
    state: Two-letter state abbreviation (e.g. 'WA', 'CA') or 2-digit FIPS code.
    county_fips: Three-digit county FIPS code (e.g. '033' for King County).
                 Omit to get all counties in the state.
    year: ACS 5-year estimate year (default 2022).
Parameters (JSON Schema)

    Name         Required   Description   Default
    year         No
    state        Yes
    county_fips  No

Output Schema (JSON Schema)

    Name     Required   Description
    result   Yes
Behavior: 4/5

Discloses the key behaviors: the ACS 5-year estimate source, the population 25+ filter, and batch retrieval when county_fips is omitted (defaults to null).

Conciseness: 5/5

Efficient structure with the purpose front-loaded, followed by a return-value summary and an Args section; no redundant text.

Completeness: 4/5

Adequately complete given that an output schema exists; summarizes the return categories without duplicating the schema structure.

Parameters: 5/5

Comprehensively compensates for 0% schema description coverage by providing formats, examples ('WA', '033'), and default behavior for all three parameters.

Purpose: 4/5

States a clear, specific purpose (educational attainment for counties, population 25+) but lacks explicit differentiation from the sibling demographic tools.

Usage Guidelines: 3/5

Implies usage through data specificity and notes that omitting county_fips returns all counties, but lacks explicit when-to-use vs. alternatives guidance.

get_tract_data

Get tract-level ACS data for any variables within a county.

This is a flexible tool for querying any ACS 5-year estimate variables at
the census tract level. Automatically batches requests if more than 50
variables are requested.

Common variable examples:
- B01001_001E: Total population
- B19013_001E: Median household income
- B17001_002E: Population below poverty level
- B25077_001E: Median home value
- B02001_002E-008E: Race breakdown

Args:
    state: Two-letter state abbreviation (e.g. 'WA') or 2-digit FIPS code.
    county_fips: Three-digit county FIPS code (e.g. '033' for King County, WA).
    variables: Comma-separated ACS variable codes (e.g. 'B01001_001E,B19013_001E').
               NAME is always included automatically.
    year: ACS 5-year estimate year (default 2022).
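The docstring notes that requests are automatically batched when more than 50 variables are requested (the Census API caps variables per call). A plausible sketch of that batching; the helper name and exact strategy are assumptions, not the server's actual code:

```python
def chunk_variables(variables: list[str], batch_size: int = 50) -> list[list[str]]:
    """Split a list of ACS variable codes into batches of at most
    batch_size, one API request per batch."""
    return [
        variables[i : i + batch_size]
        for i in range(0, len(variables), batch_size)
    ]

# 120 variables -> three requests (50 + 50 + 20).
codes = [f"B01001_{n:03d}E" for n in range(1, 121)]
batches = chunk_variables(codes)
print([len(b) for b in batches])  # → [50, 50, 20]
```

The per-tract rows from each batch would then be joined on the tract's GEOID before being returned as a single result.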
Parameters (JSON Schema)

    Name         Required   Description   Default
    year         No
    state        Yes
    variables    Yes
    county_fips  Yes

Output Schema (JSON Schema)

    Name     Required   Description
    result   Yes
Behavior: 4/5

Discloses important behavioral traits not in the schema: automatic batching for more than 50 variables, and that the NAME field is always included automatically.

Conciseness: 4/5

Well-structured with a front-loaded purpose, bulleted examples, and a clear Args section; slightly verbose ('flexible tool' is filler), but every section adds value.

Completeness: 4/5

Comprehensive given the input complexity; correctly omits output details since an output schema exists, covers all parameters thoroughly, and includes helpful variable examples.

Parameters: 5/5

Excellent compensation for 0% schema description coverage; the Args section provides detailed formats, examples ('WA', '033'), and constraints for every parameter.

Purpose: 4/5

Clearly states that it retrieves tract-level ACS data with a specific verb and resource, implicitly distinguishing itself from the county-level siblings via geographic specificity, though it lacks an explicit sibling comparison.

Usage Guidelines: 3/5

Provides implied usage context through the geographic level (tract vs. county), but lacks explicit when-to-use / when-not-to-use guidance relative to the sibling tools.
