# query_local_ai
Query a local AI model via Ollama for reasoning assistance within the Enhanced Architecture MCP server. Provide a prompt to generate a response, and optionally choose a model and adjust the temperature to tailor the output.
## Instructions
Query local AI model via Ollama for reasoning assistance
## Input Schema
| Name | Required | Description | Default |
|---|---|---|---|
| model | No | Model name | architecture-reasoning:latest |
| prompt | Yes | The reasoning prompt to send to the local AI | |
| temperature | No | Temperature for the response (0.1-1.0) | |
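As a sketch of how these parameters map onto a local Ollama instance, the snippet below builds the kind of payload the tool presumably forwards to Ollama's standard `POST /api/generate` endpoint (default port 11434). The `build_query` and `query_local_ai` helper names are illustrative, not part of the MCP server's actual implementation.

```python
import json
import urllib.request

OLLAMA_URL = "http://localhost:11434/api/generate"  # Ollama's default local endpoint

def build_query(prompt, model="architecture-reasoning:latest", temperature=None):
    """Build the JSON payload for Ollama's /api/generate endpoint.

    Defaults mirror the input schema above: model falls back to
    architecture-reasoning:latest, and temperature is optional.
    """
    payload = {"model": model, "prompt": prompt, "stream": False}
    if temperature is not None:
        # Enforce the documented 0.1-1.0 range
        if not 0.1 <= temperature <= 1.0:
            raise ValueError("temperature must be between 0.1 and 1.0")
        payload["options"] = {"temperature": temperature}
    return payload

def query_local_ai(prompt, **kwargs):
    """Send the prompt to a locally running Ollama instance and return the response text."""
    req = urllib.request.Request(
        OLLAMA_URL,
        data=json.dumps(build_query(prompt, **kwargs)).encode(),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]
```

Note that `stream` is set to `False` so Ollama returns a single JSON object rather than a stream of partial responses, which keeps the response-handling simple.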