Why this server?
This server wraps the Azure CLI, allowing LLMs such as Claude to generate and execute Azure CLI commands, including commands for inspecting logs.
Why this server?
This server allows executing shell commands within a secure Docker container, providing a controlled environment for AI models.
Why this server?
This server integrates with Sumo Logic's API to enable log search with configurable queries and time ranges.
Why this server?
A lightweight MCP server that provides a unified interface to various LLM providers including OpenAI, Anthropic, Google Gemini, Groq, DeepSeek, and Ollama.
Why this server?
Opens a browser to monitor and retrieve console logs and network requests, providing LLMs with structured data about web page behavior.
Why this server?
This server acts as a proxy that combines multiple MCP servers into a single interface.