
LangExtract MCP Server

by larsenweigle

extract_from_text

Extract structured data from unstructured text using Large Language Models. Define extraction instructions and examples to identify entities, map them to source locations, and retrieve precise metadata for accurate grounding.

Instructions

Extract structured information from text using langextract.

Uses Large Language Models to extract structured information from unstructured text based on user-defined instructions and examples. Each extraction is mapped to its exact location in the source text for precise source grounding.

Args:
  text: The text to extract information from
  prompt_description: Clear instructions for what to extract
  examples: List of example extractions to guide the model
  model_id: LLM model to use (default: "gemini-2.5-flash")
  max_char_buffer: Max characters per chunk (default: 1000)
  temperature: Sampling temperature 0.0-1.0 (default: 0.5)
  extraction_passes: Number of extraction passes for better recall (default: 1)
  max_workers: Max parallel workers (default: 10)

Returns: Dictionary containing extracted entities with source locations and metadata

Raises: ToolError: If extraction fails due to invalid parameters or API issues
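
For orientation, here is a minimal sketch of calling this tool from a Python MCP client. The local launch command and the shape of each item in examples are assumptions (the schema only requires a list of objects); the example fields shown mirror langextract's example format and may differ in practice.

import asyncio

from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client

# Assumption: the server is launched locally over stdio; replace the command
# with however you actually run the LangExtract MCP Server.
server_params = StdioServerParameters(command="langextract-mcp")

async def main():
    async with stdio_client(server_params) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()
            result = await session.call_tool(
                "extract_from_text",
                arguments={
                    "text": "The patient was started on 20 mg of lisinopril daily.",
                    "prompt_description": "Extract medications and their dosages.",
                    # Assumed, langextract-style example item; the schema only
                    # requires objects, so adjust the fields to your setup.
                    "examples": [
                        {
                            "text": "She was given 500 mg of amoxicillin.",
                            "extractions": [
                                {
                                    "extraction_class": "medication",
                                    "extraction_text": "amoxicillin",
                                    "attributes": {"dosage": "500 mg"},
                                }
                            ],
                        }
                    ],
                    "model_id": "gemini-2.5-flash",
                },
            )
            print(result.content)

asyncio.run(main())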

Input Schema

Name                Required  Description                                     Default
examples            Yes       List of example extractions to guide the model
extraction_passes   No        Number of extraction passes for better recall   1
max_char_buffer     No        Max characters per chunk                        1000
max_workers         No        Max parallel workers                            10
model_id            No        LLM model to use                                gemini-2.5-flash
prompt_description  Yes       Clear instructions for what to extract
temperature         No        Sampling temperature 0.0-1.0                    0.5
text                Yes       The text to extract information from

Input Schema (JSON Schema)

{ "properties": { "examples": { "items": { "additionalProperties": true, "type": "object" }, "title": "Examples", "type": "array" }, "extraction_passes": { "default": 1, "title": "Extraction Passes", "type": "integer" }, "max_char_buffer": { "default": 1000, "title": "Max Char Buffer", "type": "integer" }, "max_workers": { "default": 10, "title": "Max Workers", "type": "integer" }, "model_id": { "default": "gemini-2.5-flash", "title": "Model Id", "type": "string" }, "prompt_description": { "title": "Prompt Description", "type": "string" }, "temperature": { "default": 0.5, "title": "Temperature", "type": "number" }, "text": { "title": "Text", "type": "string" } }, "required": [ "text", "prompt_description", "examples" ], "type": "object" }

MCP directory API

We provide all the information about MCP servers via our MCP directory API.

curl -X GET 'https://glama.ai/api/mcp/v1/servers/larsenweigle/langextract-mcp'
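
The same lookup in Python, shown as a small sketch; the response structure is whatever the API returns for this server entry.

import requests  # pip install requests

# Fetch the directory entry for the LangExtract MCP Server.
resp = requests.get("https://glama.ai/api/mcp/v1/servers/larsenweigle/langextract-mcp")
resp.raise_for_status()
print(resp.json())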

If you have feedback or need assistance with the MCP directory API, please join our Discord server.