# prompt_from_file_to_file

Send a prompt from a file to multiple LLM models and store their responses in a specified directory, using absolute paths for both the input file and the output directory.
## Instructions

Send a prompt from a file to multiple LLM models and save each model's response to a file. IMPORTANT: You MUST provide absolute paths (e.g., `/path/to/file` or `C:\path\to\file`) for both the prompt file and the output directory, not relative paths.
## Input Schema
| Name | Required | Description | Default |
|---|---|---|---|
| abs_file_path | Yes | Absolute path to the file containing the prompt (must be an absolute path, not relative) | |
| abs_output_dir | No | Absolute directory path to save the response files to (must be an absolute path, not relative) | `.` (current directory) |
| models_prefixed_by_provider | No | List of models with provider prefixes (e.g., `openai:gpt-4o` or `o:gpt-4o`). If not provided, uses the default models. | |
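For illustration, a call supplying all three arguments might look like the following (the file and directory paths are hypothetical; the model string uses the provider-prefixed form shown above):

```json
{
  "abs_file_path": "/home/user/prompts/question.txt",
  "abs_output_dir": "/home/user/llm_responses",
  "models_prefixed_by_provider": ["openai:gpt-4o"]
}
```

Omitting `abs_output_dir` falls back to the current directory, and omitting `models_prefixed_by_provider` uses the server's default models.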
## Input Schema (JSON Schema)
```json
{
  "properties": {
    "abs_file_path": {
      "description": "Absolute path to the file containing the prompt (must be an absolute path, not relative)",
      "title": "Abs File Path",
      "type": "string"
    },
    "abs_output_dir": {
      "default": ".",
      "description": "Absolute directory path to save the response files to (must be an absolute path, not relative. Default: current directory)",
      "title": "Abs Output Dir",
      "type": "string"
    },
    "models_prefixed_by_provider": {
      "anyOf": [
        {
          "items": {
            "type": "string"
          },
          "type": "array"
        },
        {
          "type": "null"
        }
      ],
      "default": null,
      "description": "List of models with provider prefixes (e.g., 'openai:gpt-4o' or 'o:gpt-4o'). If not provided, uses default models.",
      "title": "Models Prefixed By Provider"
    }
  },
  "required": [
    "abs_file_path"
  ],
  "title": "PromptFromFileToFileSchema",
  "type": "object"
}
```
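Since the schema insists on absolute paths, a caller can catch mistakes before sending the request. The sketch below is a hypothetical client-side helper (not part of the tool itself, which performs its own validation); the path check assumes the client's platform conventions via `os.path.isabs`, so Windows paths like `C:\path` only register as absolute on Windows:

```python
import os.path


def validate_args(abs_file_path, abs_output_dir=".", models_prefixed_by_provider=None):
    """Sanity-check arguments against the schema's constraints before calling the tool."""
    # abs_file_path is required and must be absolute.
    if not os.path.isabs(abs_file_path):
        raise ValueError(f"abs_file_path must be absolute, got: {abs_file_path!r}")
    # The schema's default for abs_output_dir is "." (current directory);
    # any explicitly supplied value should be an absolute path.
    if abs_output_dir != "." and not os.path.isabs(abs_output_dir):
        raise ValueError(f"abs_output_dir must be absolute, got: {abs_output_dir!r}")
    # Each model string carries a provider prefix separated by a colon,
    # e.g. "openai:gpt-4o" or the short form "o:gpt-4o".
    if models_prefixed_by_provider is not None:
        for model in models_prefixed_by_provider:
            if ":" not in model:
                raise ValueError(
                    f"model must be provider-prefixed, e.g. 'openai:gpt-4o': {model!r}"
                )
    return True
```

This mirrors the JSON Schema above: only `abs_file_path` is mandatory, and `None` for the model list means the server's defaults are used.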