batch_update_bitable_records

Update multiple records in a Feishu Bitable table in one bulk operation (up to 1000 records per call) by specifying each record's ID and new field values.

Instructions

    Batch update records in a Bitable (up to 1000 records)

    Parameters:
        app_token: token of the Bitable app
        table_id: ID of the data table
        records: list of records to update; each record contains record_id and fields

    Returns:
        Information about the updated records
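To make the shape of the `records` parameter concrete, here is a hypothetical payload updating two rows. The record IDs and field names (`Status`, `Priority`, `Owner`) are invented for illustration and are not part of the tool's schema:

```python
# Illustrative `records` value: each entry names the row to update
# (record_id) and the fields to overwrite. All values below are
# made up for the example.
records = [
    {
        "record_id": "recAbC123",
        "fields": {"Status": "Done", "Priority": 1},
    },
    {
        "record_id": "recDeF456",
        "fields": {"Status": "In progress", "Owner": "alice"},
    },
]
```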
    

Input Schema

Name       Required  Description  Default
app_token  Yes       —            —
table_id   Yes       —            —
records    Yes       —            —

Output Schema

Name    Required  Description  Default
result  Yes       —            —

Implementation Reference

  • The main handler function for the 'batch_update_bitable_records' tool. It uses the Lark API to batch update records in a Bitable table, taking app_token, table_id, and a list of records (each with record_id and fields).
    @mcp.tool()
    @handle_feishu_error
    def batch_update_bitable_records(
        app_token: str, table_id: str, records: list[dict]
    ) -> str:
        """
        批量更新多维表格记录(最多1000条)
    
        参数:
            app_token: 多维表格的token
            table_id: 数据表ID
            records: 要更新的记录列表,每条记录包含record_id和fields
    
        返回:
            更新后的记录信息
        """
        client = get_client()
        request = (
            BatchUpdateAppTableRecordRequest.builder()
            .app_token(app_token)
            .table_id(table_id)
            .request_body(
            lark.BatchUpdateAppTableRecordRequestBody.builder()
            # The batch_update request body field is named `records`,
            # not `children`.
            .records(records)
            .build()
            )
            .build()
        )
        response = client.bitable.v1.app_table_record.batch_update(request)
        return lark.JSON.marshal(response.data, indent=4)
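The handler above does not chunk its input, so a caller holding more than 1000 records must split the list before calling the tool. A minimal sketch of such a driver, where `call_tool` is a stand-in for invoking `batch_update_bitable_records` (the helper names are assumptions, not part of the server):

```python
def chunked(items, size=1000):
    """Yield consecutive slices of at most `size` items."""
    for start in range(0, len(items), size):
        yield items[start:start + size]

def update_all(call_tool, app_token, table_id, records):
    """Drive one tool call per slice of at most 1000 records.

    `call_tool` stands in for the batch_update_bitable_records
    handler; passing it in keeps this sketch self-contained.
    """
    return [
        call_tool(app_token, table_id, batch)
        for batch in chunked(records, 1000)
    ]
```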
  • Registers the batch_update_bitable_records tool (along with other bitable record tools) by calling the register_bitable_record_tools function on the MCP instance.
    register_bitable_record_tools(mcp)
Behavior: 2/5

Does the description disclose side effects, auth requirements, rate limits, or destructive behavior?

With no annotations provided, the description carries the full burden. It mentions the 1000-record limit (useful context) but omits critical behavioral details: authentication requirements, error handling, whether updates are atomic, rate limits, and what happens on partial failure. For a batch mutation tool, that is a significant transparency gap.

Agents need to know what a tool does to the world before calling it. Descriptions should go beyond structured annotations to explain consequences.
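As an illustration of what the description leaves to the caller: Lark's OpenAPI envelope signals failure with a non-zero `code`, so a careful caller might validate each response itself. A rough sketch against a plain-dict stand-in for that envelope (this is not the server's actual error handling, which lives in `handle_feishu_error`):

```python
def check_batch_response(resp: dict) -> list:
    """Raise if a Lark-style response signals failure; otherwise
    return the updated records.

    `resp` mimics the JSON envelope
    {"code": 0, "msg": "success", "data": {"records": [...]}}.
    """
    if resp.get("code", -1) != 0:
        raise RuntimeError(f"batch update failed: {resp.get('msg')}")
    return resp.get("data", {}).get("records", [])
```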

Conciseness: 4/5

Is the description appropriately sized, front-loaded, and free of redundancy?

Well-structured, with clear sections for purpose, parameters, and return value. The purpose statement is front-loaded. There is some redundancy (e.g., the 'Parameters:' heading could be dropped in a more compact format), but overall the description is efficient with minimal waste.

Shorter descriptions cost fewer tokens and are easier for agents to parse. Every sentence should earn its place.

Completeness: 3/5

Given the tool's complexity, does the description cover enough for an agent to succeed on first attempt?

Given the tool's complexity (a batch mutation with three parameters and no annotations) and the fact that an output schema exists (so return values are covered), the description is moderately complete. It covers the basic purpose and parameters but lacks behavioral context and usage guidance: adequate, yet with clear gaps for a mutation tool.

Complex tools with many parameters or behaviors need more documentation. Simple tools need less. This dimension scales expectations accordingly.

Parameters: 3/5

Does the description clarify parameter syntax, constraints, interactions, or defaults beyond what the schema provides?

Schema description coverage is 0%, so the description must compensate. It lists all three parameters with brief explanations, adding meaning beyond the bare schema. However, it does not detail the structure of 'records' (beyond mentioning record_id and fields) or provide examples, leaving gaps in parameter understanding.

Input schemas describe structure but not intent. Descriptions should explain non-obvious parameter relationships and valid value ranges.

Purpose: 4/5

Does the description clearly state what the tool does and how it differs from similar tools?

The description clearly states the action ('批量更新' - batch update) and resource ('多维表格记录' - multi-dimensional table records), with a specific scope of up to 1000 records. It distinguishes from siblings like 'batch_create_bitable_records' and 'update_bitable_record' by emphasizing batch operations, though it doesn't explicitly name alternatives.

Agents choose between tools based on descriptions. A clear purpose with a specific verb and resource helps agents select the right tool.

Usage Guidelines: 2/5

Does the description explain when to use this tool, when not to, or what alternatives exist?

No explicit guidance on when to use this tool versus alternatives like 'update_bitable_record' (single record update) or 'batch_create_bitable_records'. The description mentions the 1000-record limit but doesn't provide context for choosing between batch and single operations or other sibling tools.

Agents often have multiple tools that could apply. Explicit usage guidance like "use X instead of Y when Z" prevents misuse.
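The kind of "use X instead of Y when Z" rule the description could state reduces to a simple decision. A hypothetical routing sketch, with the sibling tool names taken from the review above and the chunking note reflecting the 1000-record API cap:

```python
def pick_update_tool(record_count: int) -> str:
    """Hypothetical routing rule between the single-record and
    batch update tools, honoring the 1000-record cap per call."""
    if record_count <= 0:
        raise ValueError("nothing to update")
    if record_count == 1:
        return "update_bitable_record"
    if record_count <= 1000:
        return "batch_update_bitable_records"
    return "batch_update_bitable_records (split into chunks of 1000)"
```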
