create-flink-statement
Submit Flink SQL statements to process data streams in Confluent Cloud. Each statement is identified by a unique name and is executed against a given catalog and database within a specified environment and compute pool.
Instructions
Make a request to create a statement.
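For example, a request body might look like the following. This is a minimal sketch, assuming the tool accepts a flat JSON object of the parameters listed in the table below; every name and ID here is a placeholder:

```json
{
  "statementName": "filter-large-orders",
  "catalogName": "my-environment",
  "databaseName": "my-kafka-cluster",
  "computePoolId": "lfcp-000000",
  "statement": "SELECT * FROM `orders` WHERE amount > 100;"
}
```

Per the table below, only `catalogName`, `databaseName`, `statement`, and `statementName` are required; the remaining fields are optional.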
Input Schema
Name | Required | Description | Default |
---|---|---|---|
baseUrl | No | The base URL of the Flink REST API. | |
catalogName | Yes | The catalog name to use for the statement; typically the Confluent environment name. | |
computePoolId | No | The ID of the compute pool in context. | |
databaseName | Yes | The database name to use for the statement; typically the Kafka cluster name. | |
environmentId | No | The unique identifier for the environment. | |
organizationId | No | The unique identifier for the organization. | |
statement | Yes | The raw Flink SQL statement text. CREATE TABLE statements may not be necessary, since existing topics in Confluent Cloud are detected automatically along with their schemas; show and describe tables before creating new ones (see the sketch after this table). | |
statementName | Yes | The user-provided name of the statement, unique within this environment. | |
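As referenced in the `statement` row above, it is prudent to list and inspect existing tables before creating new ones. A hypothetical exploratory request, reusing the placeholder names from the earlier example, could be:

```json
{
  "statementName": "show-tables-1",
  "catalogName": "my-environment",
  "databaseName": "my-kafka-cluster",
  "statement": "SHOW TABLES;"
}
```

A DESCRIBE of a specific table follows the same shape, with the SQL text swapped accordingly.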
Input Schema (JSON Schema)
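The rendered JSON Schema was not captured on this page. A reconstruction from the table above, assuming all parameters are plain strings (an assumption, not stated in the source):

```json
{
  "type": "object",
  "properties": {
    "baseUrl": {
      "type": "string",
      "description": "The base URL of the Flink REST API."
    },
    "catalogName": {
      "type": "string",
      "description": "The catalog name to use for the statement; typically the Confluent environment name."
    },
    "computePoolId": {
      "type": "string",
      "description": "The ID of the compute pool in context."
    },
    "databaseName": {
      "type": "string",
      "description": "The database name to use for the statement; typically the Kafka cluster name."
    },
    "environmentId": {
      "type": "string",
      "description": "The unique identifier for the environment."
    },
    "organizationId": {
      "type": "string",
      "description": "The unique identifier for the organization."
    },
    "statement": {
      "type": "string",
      "description": "The raw Flink SQL statement text."
    },
    "statementName": {
      "type": "string",
      "description": "The user-provided name of the statement, unique within this environment."
    }
  },
  "required": ["catalogName", "databaseName", "statement", "statementName"]
}
```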
Other Tools from mcp-confluent
- add-tags-to-topic
- alter-topic-config
- create-connector
- create-flink-statement
- create-topics
- create-topic-tags
- delete-connector
- delete-flink-statements
- delete-tag
- delete-topics
- list-clusters
- list-connectors
- list-environments
- list-flink-statements
- list-schemas
- list-tags
- list-topics
- produce-message
- read-connector
- read-environment
- read-flink-statement
- remove-tag-from-entity
- search-topics-by-name
- search-topics-by-tag
Related Tools
- @confluentinc/mcp-confluent
- @ktanaka101/mcp-server-duckdb