
MCP OpenDART

by ChangooLee

search_financial_notes

Search financial statement notes, business descriptions, and company overviews from South Korean corporate disclosures to find specific keywords in tables and paragraphs.

Instructions

Searches tables and paragraphs in extracted financial statement notes, business description, and company overview data. You can search for specific keywords in consolidated financial statement notes, separate financial statement notes, the business description, and the company overview.

Input Schema

Name | Required | Description | Default
--- | --- | --- | ---
rcp_no | Yes | Disclosure receipt number (e.g., 20241231000420). Obtainable via get_disclosure_list |
search_term | Yes | Keyword to search for (e.g., CSM, 계약서비스마진 (contractual service margin), 보험계약자산 (insurance contract assets)) |
section_type | No | Section to search (all: everything, consolidated_notes: consolidated financial statement notes, separate_notes: separate financial statement notes, business_content: business description, company_overview: company overview) | all
search_in | No | Search scope (both: tables and paragraphs, tables: tables only, paragraphs: paragraphs only) | both
case_sensitive | No | Whether the search is case-sensitive | False
max_results | No | Maximum number of results | 50
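As a rough sketch of how an agent might assemble arguments from the schema above, the snippet below merges caller-supplied options over the documented defaults. The parameter names and defaults come from the schema; the `build_search_args` helper itself is hypothetical and is not part of the MCP OpenDART server.

```python
def build_search_args(rcp_no: str, search_term: str, **options) -> dict:
    """Build a search_financial_notes argument payload.

    rcp_no and search_term are the two required parameters; the four
    optional parameters fall back to the defaults listed in the schema.
    """
    defaults = {
        "section_type": "all",    # all / consolidated_notes / separate_notes /
                                  # business_content / company_overview
        "search_in": "both",      # both / tables / paragraphs
        "case_sensitive": False,
        "max_results": 50,
    }
    unknown = set(options) - set(defaults)
    if unknown:
        raise ValueError(f"unknown parameters: {sorted(unknown)}")
    return {"rcp_no": rcp_no, "search_term": search_term, **defaults, **options}

# Example: search consolidated notes for "CSM", keeping other defaults.
args = build_search_args("20241231000420", "CSM",
                         section_type="consolidated_notes")
```

Overriding only `section_type` leaves `search_in`, `case_sensitive`, and `max_results` at their schema defaults, which mirrors how an agent would typically call the tool.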
Behavior: 2/5

Does the description disclose side effects, auth requirements, rate limits, or destructive behavior?

No annotations are provided, so the description carries the full burden. It mentions searching capabilities but doesn't disclose important behavioral traits: whether this is a read-only operation, what permissions might be required, whether there are rate limits, what format the results return, or how pagination works. For a search tool with no annotation coverage, this leaves significant gaps in understanding how the tool behaves.

Agents need to know what a tool does to the world before calling it. Descriptions should go beyond structured annotations to explain consequences.

Conciseness: 4/5

Is the description appropriately sized, front-loaded, and free of redundancy?

The description is appropriately concise, with two sentences that directly state the tool's functionality. It's front-loaded with the core purpose and follows with additional capability details. There's no wasted verbiage, though it could be structured more clearly for non-Korean readers.

Shorter descriptions cost fewer tokens and are easier for agents to parse. Every sentence should earn its place.

Completeness: 2/5

Given the tool's complexity, does the description cover enough for an agent to succeed on first attempt?

Given the tool has 6 parameters, no annotations, and no output schema, the description is insufficiently complete. It doesn't explain what the search results look like, how they're structured, what happens when no matches are found, or any limitations of the search functionality. For a search tool with moderate complexity and no structured output documentation, the description should provide more context about the return values and search behavior.

Complex tools with many parameters or behaviors need more documentation. Simple tools need less. This dimension scales expectations accordingly.

Parameters: 3/5

Does the description clarify parameter syntax, constraints, interactions, or defaults beyond what the schema provides?

Schema description coverage is 100%, so the schema already fully documents all 6 parameters. The description mentions searching in tables and paragraphs, and searching specific sections, which aligns with the 'search_in' and 'section_type' parameters but doesn't add meaningful semantic context beyond what the schema provides. The baseline of 3 is appropriate when the schema does the heavy lifting.

Input schemas describe structure but not intent. Descriptions should explain non-obvious parameter relationships and valid value ranges.

Purpose: 4/5

Does the description clearly state what the tool does and how it differs from similar tools?

The description clearly states the tool's purpose: searching tables and paragraphs in financial statement notes, business description, and company overview data. It specifies the verb 'search' and the resources (extracted financial statement notes, business description, company overview). However, it doesn't explicitly differentiate itself from sibling tools, which appear to be various data retrieval tools, none of which is specifically for searching within financial documents.

Agents choose between tools based on descriptions. A clear purpose with a specific verb and resource helps agents select the right tool.

Usage Guidelines: 2/5

Does the description explain when to use this tool, when not to, or what alternatives exist?

The description provides no guidance on when to use this tool versus alternatives. It mentions that the tool can search specific sections (consolidated notes, separate notes, business description, company overview) but doesn't indicate when you would choose it over the other data retrieval tools in the sibling list. No prerequisites, exclusions, or comparisons to similar tools are provided.

Agents often have multiple tools that could apply. Explicit usage guidance like "use X instead of Y when Z" prevents misuse.


MCP directory API

We provide all the information about MCP servers via our MCP API.

curl -X GET 'https://glama.ai/api/mcp/v1/servers/ChangooLee/mcp-opendart'

If you have feedback or need assistance with the MCP directory API, please join our Discord server.