LangExtract MCP Server

by larsenweigle

extract_from_text

Extract structured data from unstructured text using customizable instructions and examples. Maps extracted entities to their exact source locations for precise results.

Instructions

Extract structured information from text using langextract.

Uses Large Language Models to extract structured information from unstructured text based on user-defined instructions and examples. Each extraction is mapped to its exact location in the source text for precise source grounding.

Args:
  text: The text to extract information from
  prompt_description: Clear instructions for what to extract
  examples: List of example extractions to guide the model
  config: Configuration parameters for the extraction

Returns:
  Dictionary containing extracted entities with source locations and metadata

Raises:
  ToolError: If extraction fails due to invalid parameters or API issues
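
A minimal, illustrative arguments payload is sketched below. The text, prompt, and extraction values are invented for illustration, and the field names inside each extraction object (extraction_class, extraction_text, attributes) follow the upstream langextract conventions rather than anything enforced by this tool's schema, which only requires that every example carry text and a list of extractions objects.

# Illustrative payload for extract_from_text; all values are placeholders.
arguments = {
    "text": "Patient was prescribed 500 mg of amoxicillin twice daily.",
    "prompt_description": "Extract medications with their dosage and frequency.",
    "examples": [
        {
            "text": "Take 200 mg of ibuprofen every 6 hours.",
            "extractions": [
                {
                    # Keys below follow langextract conventions (assumption).
                    "extraction_class": "medication",
                    "extraction_text": "ibuprofen",
                    "attributes": {"dosage": "200 mg", "frequency": "every 6 hours"},
                }
            ],
        }
    ],
    # Optional; defaults are listed in the Input Schema below.
    "config": {"model_id": "gemini-2.5-flash", "extraction_passes": 1},
}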

Input Schema

Name                Required  Description                                      Default
config              No        Configuration parameters for the extraction     See ExtractionConfig defaults in the JSON Schema below
examples            Yes       List of example extractions to guide the model  -
prompt_description  Yes       Clear instructions for what to extract          -
text                Yes       The text to extract information from            -

Input Schema (JSON Schema)

{ "$defs": { "ExtractionConfig": { "description": "Configuration for extraction parameters.", "properties": { "extraction_passes": { "default": 1, "description": "Number of extraction passes for better recall", "title": "Extraction Passes", "type": "integer" }, "max_char_buffer": { "default": 1000, "description": "Max characters per chunk", "title": "Max Char Buffer", "type": "integer" }, "max_workers": { "default": 10, "description": "Max parallel workers", "title": "Max Workers", "type": "integer" }, "model_id": { "default": "gemini-2.5-flash", "description": "LLM model to use", "title": "Model Id", "type": "string" }, "temperature": { "default": 0.5, "description": "Sampling temperature (0.0-1.0)", "title": "Temperature", "type": "number" } }, "title": "ExtractionConfig", "type": "object" }, "ExtractionExample": { "description": "Model for extraction examples.", "properties": { "extractions": { "description": "Expected extractions", "items": { "additionalProperties": true, "type": "object" }, "title": "Extractions", "type": "array" }, "text": { "description": "Example text", "title": "Text", "type": "string" } }, "required": [ "text", "extractions" ], "title": "ExtractionExample", "type": "object" } }, "properties": { "config": { "$ref": "#/$defs/ExtractionConfig", "default": { "extraction_passes": 1, "max_char_buffer": 1000, "max_workers": 10, "model_id": "gemini-2.5-flash", "temperature": 0.5 }, "title": "Config" }, "examples": { "items": { "$ref": "#/$defs/ExtractionExample" }, "title": "Examples", "type": "array" }, "prompt_description": { "title": "Prompt Description", "type": "string" }, "text": { "title": "Text", "type": "string" } }, "required": [ "text", "prompt_description", "examples" ], "type": "object" }

MCP directory API

We provide all the information about MCP servers via our MCP API.

curl -X GET 'https://glama.ai/api/mcp/v1/servers/larsenweigle/langextract-mcp'

If you have feedback or need assistance with the MCP directory API, please join our Discord server.