read-flink-statement
Retrieve and analyze Flink statements and their results by querying the Flink REST API. Specify the statement name, organization, and environment to access real-time data streams or sampled results.
Instructions
Make a request to read a statement and its results
Input Schema
| Name | Required | Description | Default |
|---|---|---|---|
| baseUrl | No | The base URL of the Flink REST API. | |
| environmentId | No | The unique identifier for the environment. | |
| organizationId | No | The unique identifier for the organization. | |
| statementName | Yes | The user-provided name of the resource, unique within this environment. | |
| timeoutInMilliseconds | No | Timeout in milliseconds. The function paginates through results using the next-page token until there are no more results or the timeout is reached. Tables backed by Kafka topics behave as unbounded streams, since data can be produced continuously in near real time, so set a timeout if you only want to sample values from a stream. If you read a statement immediately after creating it, you may need to retry a few times until the statement is ready and receiving data. | |
Input Schema (JSON Schema)
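The parameter table above corresponds roughly to the JSON Schema sketched below. The property types are assumptions inferred from the parameter names and descriptions on this page, not copied from the server source, so treat this as illustrative only.

```json
{
  "type": "object",
  "properties": {
    "baseUrl": {
      "type": "string",
      "description": "The base URL of the Flink REST API."
    },
    "environmentId": {
      "type": "string",
      "description": "The unique identifier for the environment."
    },
    "organizationId": {
      "type": "string",
      "description": "The unique identifier for the organization."
    },
    "statementName": {
      "type": "string",
      "description": "The user-provided name of the resource, unique within this environment."
    },
    "timeoutInMilliseconds": {
      "type": "number",
      "description": "How long to keep paginating through results before returning."
    }
  },
  "required": ["statementName"]
}
```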
You must be authenticated.
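As a concrete illustration, a minimal call to this tool from an MCP client might look like the sketch below. It assumes the official @modelcontextprotocol/sdk TypeScript client and a locally launched mcp-confluent server with valid Confluent Cloud credentials; the launch command, flags, and all ID values are placeholders, not taken from this page.

```typescript
import { Client } from "@modelcontextprotocol/sdk/client/index.js";
import { StdioClientTransport } from "@modelcontextprotocol/sdk/client/stdio.js";

async function main() {
  // Launch the mcp-confluent server over stdio. The command, flags, and
  // .env path are assumptions; configure credentials per the server's docs.
  const transport = new StdioClientTransport({
    command: "npx",
    args: ["-y", "@confluentinc/mcp-confluent", "-e", ".env"],
  });

  const client = new Client({ name: "example-client", version: "1.0.0" });
  await client.connect(transport);

  // Read a statement and sample its results. The IDs below are placeholders.
  const result = await client.callTool({
    name: "read-flink-statement",
    arguments: {
      statementName: "my-flink-statement",
      organizationId: "<organization-id>",
      environmentId: "<environment-id>",
      // Topic-backed tables are unbounded streams, so cap how long the tool
      // keeps paginating before it returns the sampled rows.
      timeoutInMilliseconds: 10_000,
    },
  });

  console.log(JSON.stringify(result.content, null, 2));
  await client.close();
}

main().catch(console.error);
```

If the call follows immediately after create-flink-statement, retry briefly until the statement is ready and receiving data, as noted in the timeoutInMilliseconds description above.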
Other Tools from mcp-confluent
- add-tags-to-topic
- alter-topic-config
- create-connector
- create-flink-statement
- create-topics
- create-topic-tags
- delete-connector
- delete-flink-statements
- delete-tag
- delete-topics
- list-clusters
- list-connectors
- list-environments
- list-flink-statements
- list-schemas
- list-tags
- list-topics
- produce-message
- read-connector
- read-environment
- read-flink-statement
- remove-tag-from-entity
- search-topics-by-name
- search-topics-by-tag
Related Tools
- @confluentinc/mcp-confluent
- @mabeldata/pocketbase-mcp
- @mtane0412/twitch-mcp-server