Unstructured-IO

Unstructured API MCP Server

Official

update_destination_connector

Modify configuration settings for data destination connectors like AstraDB, Databricks, MongoDB, Neo4j, Pinecone, S3, or Weaviate to adjust storage parameters and batch processing.

Instructions

Update a destination connector based on type.

Args:
    ctx: Context object with the request and lifespan context
    destination_id: ID of the destination connector to update
    destination_type: The type of destination being updated

    type_specific_config:
        astradb:
            collection_name: (Optional[str]): The AstraDB collection name
            keyspace: (Optional[str]): The AstraDB keyspace
            batch_size: (Optional[int]) The batch size for inserting documents
        databricks_delta_table:
            catalog: (Optional[str]): Name of the catalog in Databricks Unity Catalog
            database: (Optional[str]): The database in Unity Catalog
            http_path: (Optional[str]): The cluster’s or SQL warehouse’s HTTP Path value
            server_hostname: (Optional[str]): The Databricks cluster’s or SQL warehouse’s
                             Server Hostname value
            table_name: (Optional[str]): The name of the table in the schema
            volume: (Optional[str]): Name of the volume associated with the schema.
            schema: (Optional[str]) Name of the schema associated with the volume
            volume_path: (Optional[str]) Any target folder path within the volume, starting
                        from the root of the volume.
        databricks_volumes:
            catalog: (Optional[str]): Name of the catalog in Databricks
            host: (Optional[str]): The Databricks host URL
            volume: (Optional[str]): Name of the volume associated with the schema
            schema: (Optional[str]) Name of the schema associated with the volume. The default
                     value is "default".
            volume_path: (Optional[str]) Any target folder path within the volume,
                        starting from the root of the volume.
        mongodb:
            database: (Optional[str]): The name of the MongoDB database
            collection: (Optional[str]): The name of the MongoDB collection
        neo4j:
            database: (Optional[str]): The Neo4j database, e.g. "neo4j"
            uri: (Optional[str]): The Neo4j URI
                  e.g. neo4j+s://<neo4j_instance_id>.databases.neo4j.io
            batch_size: (Optional[int]) The batch size for the connector
        pinecone:
            index_name: (Optional[str]): The Pinecone index name
            namespace: (Optional[str]) The Pinecone namespace, a folder inside the
                       Pinecone index
            batch_size: (Optional[int]) The batch size
        s3:
            remote_url: (Optional[str]): The S3 URI to the bucket or folder
        weaviate:
            cluster_url: (Optional[str]): URL of the Weaviate cluster
            collection: (Optional[str]): Name of the collection in the Weaviate cluster

            Note: Minimal schema is required for the collection, e.g. record_id: Text

Returns:
    String containing the updated destination connector information
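
The docstring above doubles as the parameter reference. For illustration, a call payload updating only an S3 destination's target URI might look like the following sketch (the `destination_id` value is a made-up placeholder, not a real connector ID):

```python
# Hypothetical arguments for update_destination_connector; the
# destination_id below is a placeholder, not a real connector ID.
example_call = {
    "destination_id": "b25d4161-77a0-4e08-b65e-86f398ce15ad",
    "destination_type": "s3",
    "type_specific_config": {
        "remote_url": "s3://my-bucket/processed-output/",
    },
}
```

Only the fields present in `type_specific_config` are changed; omitted optional fields keep their current values.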

Input Schema

| Name | Required | Description | Default |
|------|----------|-------------|---------|
| destination_id | Yes | | |
| destination_type | Yes | | |
| type_specific_config | Yes | | |

Output Schema

| Name | Required | Description | Default |
|------|----------|-------------|---------|
| result | Yes | | |

Implementation Reference

  • The primary handler for the 'update_destination_connector' tool. Dispatches to a type-specific update function based on destination_type; input validation is driven by the type hints and docstring.
    async def update_destination_connector(
        ctx: Context,
        destination_id: str,
        destination_type: Literal[
            "astradb",
            "databricks_delta_table",
            "databricks_volumes",
            "mongodb",
            "neo4j",
            "pinecone",
            "s3",
            "weaviate",
        ],
        type_specific_config: dict[str, Any],
    ) -> str:
        """Update a destination connector based on type.
    
        Args:
            ctx: Context object with the request and lifespan context
            destination_id: ID of the destination connector to update
            destination_type: The type of destination being updated
    
            type_specific_config:
                astradb:
                    collection_name: (Optional[str]): The AstraDB collection name
                    keyspace: (Optional[str]): The AstraDB keyspace
                    batch_size: (Optional[int]) The batch size for inserting documents
                databricks_delta_table:
                    catalog: (Optional[str]): Name of the catalog in Databricks Unity Catalog
                    database: (Optional[str]): The database in Unity Catalog
                    http_path: (Optional[str]): The cluster’s or SQL warehouse’s HTTP Path value
                    server_hostname: (Optional[str]): The Databricks cluster’s or SQL warehouse’s
                                     Server Hostname value
                    table_name: (Optional[str]): The name of the table in the schema
                    volume: (Optional[str]): Name of the volume associated with the schema.
                    schema: (Optional[str]) Name of the schema associated with the volume
                    volume_path: (Optional[str]) Any target folder path within the volume, starting
                                from the root of the volume.
                databricks_volumes:
                    catalog: (Optional[str]): Name of the catalog in Databricks
                    host: (Optional[str]): The Databricks host URL
                    volume: (Optional[str]): Name of the volume associated with the schema
                    schema: (Optional[str]) Name of the schema associated with the volume. The default
                             value is "default".
                    volume_path: (Optional[str]) Any target folder path within the volume,
                                starting from the root of the volume.
                mongodb:
                    database: (Optional[str]): The name of the MongoDB database
                    collection: (Optional[str]): The name of the MongoDB collection
                neo4j:
                    database: (Optional[str]): The Neo4j database, e.g. "neo4j"
                    uri: (Optional[str]): The Neo4j URI
                          e.g. neo4j+s://<neo4j_instance_id>.databases.neo4j.io
                    batch_size: (Optional[int]) The batch size for the connector
                pinecone:
                    index_name: (Optional[str]): The Pinecone index name
                    namespace: (Optional[str]) The Pinecone namespace, a folder inside the
                               Pinecone index
                    batch_size: (Optional[int]) The batch size
                s3:
                    remote_url: (Optional[str]): The S3 URI to the bucket or folder
                weaviate:
                    cluster_url: (Optional[str]): URL of the Weaviate cluster
                    collection: (Optional[str]): Name of the collection in the Weaviate cluster
    
                    Note: Minimal schema is required for the collection, e.g. record_id: Text
    
        Returns:
            String containing the updated destination connector information
        """
        update_functions = {
            "astradb": update_astradb_destination,
            "databricks_delta_table": update_databricks_delta_table_destination,
            "databricks_volumes": update_databricks_volumes_destination,
            "mongodb": update_mongodb_destination,
            "neo4j": update_neo4j_destination,
            "pinecone": update_pinecone_destination,
            "s3": update_s3_destination,
            "weaviate": update_weaviate_destination,
        }
    
        if destination_type in update_functions:
            update_function = update_functions[destination_type]
            return await update_function(ctx=ctx, destination_id=destination_id, **type_specific_config)
    
        return (
            f"Unsupported destination type: {destination_type}. "
            f"Please use a supported destination type: {list(update_functions.keys())}."
        )
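
The dispatch table above can be sketched in isolation. In the following runnable sketch, a stub coroutine stands in for the real per-connector helpers (the stub name and message are illustrative, not part of the actual server):

```python
import asyncio

# Stand-in for a real per-connector helper such as update_s3_destination.
async def update_s3_stub(destination_id: str, **config) -> str:
    return f"Updated S3 destination {destination_id}: {config}"

UPDATE_FUNCTIONS = {"s3": update_s3_stub}

async def dispatch(destination_type: str, destination_id: str, config: dict) -> str:
    # Unknown types fall through to an error string rather than raising,
    # mirroring the handler above.
    fn = UPDATE_FUNCTIONS.get(destination_type)
    if fn is None:
        return (
            f"Unsupported destination type: {destination_type}. "
            f"Please use a supported destination type: {list(UPDATE_FUNCTIONS)}."
        )
    return await fn(destination_id, **config)

result = asyncio.run(dispatch("s3", "abc-123", {"remote_url": "s3://bucket/"}))
```

Returning an error string for unknown types (rather than raising) keeps the tool's contract simple for agents: every call produces a human-readable string.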
  • Registers the update_destination_connector tool (along with create and delete) using mcp.tool() decorator on the FastMCP server.
    def register_destination_connectors(mcp: FastMCP):
        """Register all destination connector tools with the MCP server."""
        mcp.tool()(create_destination_connector)
        mcp.tool()(update_destination_connector)
        mcp.tool()(delete_destination_connector)
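
Note the registration idiom above: `mcp.tool()` returns a decorator, so `mcp.tool()(fn)` registers `fn` directly without `@`-syntax. The calling convention can be demonstrated with a minimal stand-in for the server (`FakeMCP` below is not part of the real library):

```python
# Minimal stand-in illustrating the tool()-returns-a-decorator idiom;
# FakeMCP is hypothetical and only mimics the calling convention.
class FakeMCP:
    def __init__(self):
        self.tools = {}

    def tool(self):
        def decorator(fn):
            # Register under the function's own name, then return it
            # unchanged so it stays directly callable.
            self.tools[fn.__name__] = fn
            return fn
        return decorator

def update_destination_connector():
    """Placeholder tool body."""

mcp = FakeMCP()
mcp.tool()(update_destination_connector)
```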
  • Example type-specific helper function for updating S3 destination connectors, invoked by the main handler. Similar implementations exist for other destination types.
    async def update_s3_destination(
        ctx: Context,
        destination_id: str,
        remote_url: Optional[str] = None,
        recursive: Optional[bool] = None,
    ) -> str:
        """Update an S3 destination connector.
    
        Args:
            ctx: Context object with the request and lifespan context
            destination_id: ID of the destination connector to update
            remote_url: The S3 URI to the bucket or folder
            recursive: Whether to include subfolders under the remote URL
    
        Returns:
            String containing the updated destination connector information
        """
        client = ctx.request_context.lifespan_context.client
    
        # Get the current destination connector configuration
        try:
            get_response = await client.destinations.get_destination_async(
                request=GetDestinationRequest(destination_id=destination_id),
            )
            current_config = get_response.destination_connector_information.config
        except Exception as e:
            return f"Error retrieving destination connector: {str(e)}"
    
        # Update configuration with new values
        config = dict(current_config)
    
        if remote_url is not None:
            config["remote_url"] = remote_url
        if recursive is not None:
            config["recursive"] = recursive
    
        destination_connector = UpdateDestinationConnector(config=config)
    
        try:
            response = await client.destinations.update_destination_async(
                request=UpdateDestinationRequest(
                    destination_id=destination_id,
                    update_destination_connector=destination_connector,
                ),
            )
    
            result = create_log_for_created_updated_connector(
                response,
                connector_name="S3",
                connector_type="Destination",
                created_or_updated="Updated",
            )
            return result
        except Exception as e:
            return f"Error updating S3 destination connector: {str(e)}"
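
The helper follows a read-merge-write pattern: fetch the current config, overlay only the fields the caller supplied (anything left as `None` is untouched), then send the merged config back. The merge step in isolation:

```python
# Sketch of the merge step used above: None means "leave unchanged".
def merge_config(current: dict, **updates) -> dict:
    merged = dict(current)
    for key, value in updates.items():
        if value is not None:
            merged[key] = value
    return merged

current = {"remote_url": "s3://old-bucket/", "recursive": True}
merged = merge_config(current, remote_url="s3://new-bucket/", recursive=None)
# remote_url is replaced; recursive keeps its previous value
```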
Behavior: 2/5

Does the description disclose side effects, auth requirements, rate limits, or destructive behavior?

With no annotations provided, the description carries the full burden of behavioral disclosure. It states this is an update operation, implying mutation, but doesn't cover critical aspects like required permissions, whether changes are reversible, error handling, or rate limits. The Returns section mentions output format, but lacks details on success/failure responses or side effects.

Agents need to know what a tool does to the world before calling it. Descriptions should go beyond structured annotations to explain consequences.

Conciseness: 3/5

Is the description appropriately sized, front-loaded, and free of redundancy?

The description is appropriately front-loaded with the core purpose, but the extensive parameter documentation (while valuable) makes it lengthy. The structure with 'Args' and 'Returns' sections is clear, but some redundancy exists (e.g., repeating 'Optional' annotations). Every sentence earns its place, but it could be more streamlined.

Shorter descriptions cost fewer tokens and are easier for agents to parse. Every sentence should earn its place.

Completeness: 4/5

Given the tool's complexity, does the description cover enough for an agent to succeed on first attempt?

Given the tool's complexity (3 parameters with nested objects, no annotations, but has output schema), the description is largely complete. It covers the core purpose and detailed parameter semantics, and the output schema existence means return values needn't be explained. However, it lacks behavioral context like error cases or mutation implications, leaving minor gaps.

Complex tools with many parameters or behaviors need more documentation. Simple tools need less. This dimension scales expectations accordingly.

Parameters: 5/5

Does the description clarify parameter syntax, constraints, interactions, or defaults beyond what the schema provides?

The description adds substantial value beyond the input schema, which has 0% description coverage. It thoroughly documents the 'type_specific_config' parameter by listing all supported destination types (matching the enum) and their optional fields with clear explanations, making the parameter semantics explicit and actionable for an AI agent.

Input schemas describe structure but not intent. Descriptions should explain non-obvious parameter relationships and valid value ranges.

Purpose: 4/5

Does the description clearly state what the tool does and how it differs from similar tools?

The description clearly states the verb ('Update') and resource ('a destination connector based on type'), making the purpose evident. It distinguishes this tool from siblings like 'create_destination_connector' and 'delete_destination_connector' by specifying it's for updates, though it doesn't explicitly differentiate from 'update_source_connector' or 'update_workflow' beyond the resource name.

Agents choose between tools based on descriptions. A clear purpose with a specific verb and resource helps agents select the right tool.

Usage Guidelines: 2/5

Does the description explain when to use this tool, when not to, or what alternatives exist?

The description provides no guidance on when to use this tool versus alternatives. It doesn't mention prerequisites (e.g., needing an existing destination connector), exclusions, or comparisons to sibling tools like 'create_destination_connector' for initial setup or 'get_destination_info' for checking current settings.

Agents often have multiple tools that could apply. Explicit usage guidance like "use X instead of Y when Z" prevents misuse.
