# Caption

![pipeline](../../images/pipeline.png#only-light)
![pipeline](../../images/pipeline-dark.png#only-dark)

The caption pipeline reads a list of images and returns a list of captions for those images.

## Example

The following shows a simple example using this pipeline.

```python
from txtai.pipeline import Caption

# Create and run pipeline
caption = Caption()
caption("path to image file")
```

See the link below for a more detailed example.

| Notebook  | Description  |       |
|:----------|:-------------|------:|
| [Generate image captions and detect objects](https://github.com/neuml/txtai/blob/master/examples/25_Generate_image_captions_and_detect_objects.ipynb) | Captions and object detection for images | [![Open In Colab](https://colab.research.google.com/assets/colab-badge.svg)](https://colab.research.google.com/github/neuml/txtai/blob/master/examples/25_Generate_image_captions_and_detect_objects.ipynb) |

## Configuration-driven example

Pipelines are run with Python or configuration. Pipelines can be instantiated in [configuration](../../../api/configuration/#pipeline) using the lower case name of the pipeline. Configuration-driven pipelines are run with [workflows](../../../workflow/#configuration-driven-example) or the [API](../../../api#local-instance).

### config.yml

```yaml
# Create pipeline using lower case class name
caption:

# Run pipeline with workflow
workflow:
  caption:
    tasks:
      - action: caption
```

### Run with Workflows

```python
from txtai import Application

# Create and run pipeline with workflow
app = Application("config.yml")
list(app.workflow("caption", ["path to image file"]))
```

### Run with API

```bash
CONFIG=config.yml uvicorn "txtai.api:app" &

curl \
  -X POST "http://localhost:8000/workflow" \
  -H "Content-Type: application/json" \
  -d '{"name":"caption", "elements":["path to image file"]}'
```

## Methods

Python documentation for the pipeline.

### ::: txtai.pipeline.Caption.__init__
### ::: txtai.pipeline.Caption.__call__
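As a complement to the methods above, the following is a minimal sketch of batch captioning with an explicit model. The model name `Salesforce/blip-image-captioning-base` and the image paths are illustrative assumptions; any Hugging Face image-to-text model and any local image files should work the same way.

```python
from txtai.pipeline import Caption

# Load the pipeline with an explicit image-to-text model
# (model name is an assumption for illustration)
caption = Caption("Salesforce/blip-image-captioning-base")

# Pass a list of images to get back a list of captions
images = ["photo1.jpg", "photo2.jpg"]
for image, text in zip(images, caption(images)):
    print(f"{image}: {text}")
```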
