prompt_from_file
Generate responses from multiple LLMs by sending a prompt stored in a file. Specify the absolute file path and, optionally, the models to query, to streamline model testing and integration.
Instructions
Send a prompt from a file to multiple LLMs. IMPORTANT: You MUST provide an absolute file path (e.g., /path/to/file or C:\path\to\file), not a relative path.
Input Schema
| Name | Required | Description | Default |
|---|---|---|---|
| abs_file_path | Yes | Absolute path to the file containing the prompt (must be an absolute path, not relative) | |
| models_prefixed_by_provider | No | List of models with provider prefixes (e.g., 'openai:gpt-4o' or 'o:gpt-4o'). If not provided, uses default models. | null |
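A call to this tool might look like the following payload (the file path and model list are illustrative, not required values):

```json
{
  "abs_file_path": "/home/user/prompts/summarize.txt",
  "models_prefixed_by_provider": ["openai:gpt-4o", "o:gpt-4o"]
}
```

Omitting `models_prefixed_by_provider` (or passing null) falls back to the server's default models.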
Input Schema (JSON Schema)
```json
{
  "properties": {
    "abs_file_path": {
      "description": "Absolute path to the file containing the prompt (must be an absolute path, not relative)",
      "title": "Abs File Path",
      "type": "string"
    },
    "models_prefixed_by_provider": {
      "anyOf": [
        {
          "items": {
            "type": "string"
          },
          "type": "array"
        },
        {
          "type": "null"
        }
      ],
      "default": null,
      "description": "List of models with provider prefixes (e.g., 'openai:gpt-4o' or 'o:gpt-4o'). If not provided, uses default models.",
      "title": "Models Prefixed By Provider"
    }
  },
  "required": [
    "abs_file_path"
  ],
  "title": "PromptFromFileSchema",
  "type": "object"
}
```
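A client can mirror the schema's constraints before sending a request. The sketch below is a hypothetical client-side check (the function name is ours, not part of the tool's API), assuming only that `abs_file_path` is required and must be absolute:

```python
import os

def validate_prompt_from_file_args(abs_file_path, models_prefixed_by_provider=None):
    """Build an argument dict for the prompt_from_file tool,
    rejecting relative paths before the request is sent."""
    # The schema requires abs_file_path and insists it be absolute.
    if not os.path.isabs(abs_file_path):
        raise ValueError(
            "abs_file_path must be an absolute path, e.g. /path/to/file"
        )
    # models_prefixed_by_provider may be a list of 'provider:model'
    # strings or None (the server then uses its default models).
    return {
        "abs_file_path": abs_file_path,
        "models_prefixed_by_provider": models_prefixed_by_provider,
    }
```

Validating locally gives a clearer error than a rejected tool call, since the schema itself only reports a type mismatch, not why a relative path fails.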