🦆 MCP Rubber Duck
An MCP (Model Context Protocol) server that acts as a bridge to query multiple OpenAI-compatible LLMs. Just like rubber duck debugging, explain your problems to various AI "ducks" and get different perspectives!
Features
🔌 Universal OpenAI Compatibility: Works with any OpenAI-compatible API endpoint
🦆 Multiple Ducks: Configure and query multiple LLM providers simultaneously
💬 Conversation Management: Maintain context across multiple messages
🏛️ Duck Council: Get responses from all your configured LLMs at once
💾 Response Caching: Avoid duplicate API calls with intelligent caching
🔄 Automatic Failover: Falls back to other providers if primary fails
📊 Health Monitoring: Real-time health checks for all providers
🔗 MCP Bridge: Connect ducks to other MCP servers for extended functionality
🛡️ Granular Security: Per-server approval controls with session-based approvals
🎨 Fun Duck Theme: Rubber duck debugging with personality!
Supported Providers
Any provider with an OpenAI-compatible API endpoint, including:
OpenAI (GPT-4, GPT-3.5)
Google Gemini (Gemini 2.5 Flash, Gemini 2.0 Flash)
Anthropic (via OpenAI-compatible endpoints)
Groq (Llama, Mixtral, Gemma)
Together AI (Llama, Mixtral, and more)
Perplexity (Online models with web search)
Anyscale (Open source models)
Azure OpenAI (Microsoft-hosted OpenAI)
Ollama (Local models)
LM Studio (Local models)
Custom (Any OpenAI-compatible endpoint)
Quick Start
For Claude Desktop Users
👉 Complete Claude Desktop setup instructions are in the Claude Desktop Configuration section below.
Installation
Prerequisites
Node.js 20 or higher
npm or yarn
At least one API key for a supported provider
Installation Methods
Option 1: Install from NPM
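Presumably via the package listed in the Registry section (a global install sketch):

```bash
npm install -g mcp-rubber-duck
```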
Option 2: Install from Source
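A typical from-source flow, assuming the standard npm scripts:

```bash
git clone https://github.com/nesquikm/mcp-rubber-duck.git
cd mcp-rubber-duck
npm install
npm run build
```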
Configuration
Method 1: Environment Variables
Create a .env file in the project root:
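A minimal example using the variable names that appear later in this README (add only the keys for the providers you plan to use):

```bash
OPENAI_API_KEY=sk-...
GEMINI_API_KEY=...
GROQ_API_KEY=gsk_...
TOGETHER_API_KEY=...
```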
Note: Duck nicknames are completely optional! If you don't set them, you'll get the charming defaults (GPT Duck, Gemini Duck, etc.). If you use a config.json file, those nicknames take priority over environment variables.
Method 2: Configuration File
Create a config/config.json file based on the example:
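A rough sketch of the general shape (all field names here are assumptions; copy the shipped example file rather than this snippet):

```json
{
  "providers": {
    "openai": {
      "api_key": "your-openai-api-key-here",
      "default_model": "gpt-4",
      "nickname": "GPT Duck"
    },
    "ollama": {
      "base_url": "http://localhost:11434/v1",
      "default_model": "llama3",
      "nickname": "Local Duck"
    }
  }
}
```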
Claude Desktop Configuration
This is the most common setup method for using MCP Rubber Duck with Claude Desktop.
Step 1: Build the Project
First, ensure the project is built:
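Assuming the standard npm build script, from the project root:

```bash
npm install
npm run build
ls -la dist/index.js   # the file Claude Desktop will launch
```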
Step 2: Configure Claude Desktop
Edit your Claude Desktop config file:
macOS:
~/Library/Application Support/Claude/claude_desktop_config.json
Windows:
%APPDATA%\Claude\claude_desktop_config.json
Add the MCP server configuration:
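A typical entry looks like this (the server name "rubber-duck" is illustrative; use the full absolute path to your dist/index.js):

```json
{
  "mcpServers": {
    "rubber-duck": {
      "command": "node",
      "args": ["/absolute/path/to/mcp-rubber-duck/dist/index.js"],
      "env": {
        "OPENAI_API_KEY": "your-openai-api-key-here",
        "GEMINI_API_KEY": "your-gemini-api-key-here",
        "MCP_SERVER": "true"
      }
    }
  }
}
```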
Important: Replace the placeholder API keys with your actual keys:
your-openai-api-key-here → Your OpenAI API key (starts with sk-)
your-gemini-api-key-here → Your Gemini API key from Google AI Studio
Note: MCP_SERVER: "true" is required - this tells rubber-duck to run as an MCP server for any MCP client (not related to the MCP Bridge feature).
Step 3: Restart Claude Desktop
Completely quit Claude Desktop (⌘+Q on Mac)
Launch Claude Desktop again
The MCP server should connect automatically
Step 4: Test the Integration
Once restarted, test these commands in Claude:
Check Duck Health
Should show:
✅ GPT Duck (openai) - Healthy
✅ Gemini Duck (gemini) - Healthy
List Available Models
Ask a Specific Duck
Compare Multiple Ducks
Test Specific Models
Troubleshooting Claude Desktop Setup
If Tools Don't Appear
Check API Keys: Ensure your API keys are correctly entered without typos
Verify Build: Run ls -la dist/index.js to confirm the project built successfully
Check Logs: Look for errors in Claude Desktop's developer console
Restart: Fully quit and restart Claude Desktop after config changes
Connection Issues
Config File Path: Double-check you're editing the correct config file path
JSON Syntax: Validate your JSON syntax (no trailing commas, proper quotes)
Absolute Paths: Ensure you're using the full absolute path to dist/index.js
File Permissions: Verify Claude Desktop can read the dist directory
Health Check Failures
If ducks show as unhealthy:
API Keys: Verify keys are valid and have sufficient credits/quota
Network: Check internet connection and firewall settings
Rate Limits: Some providers have strict rate limits for new accounts
MCP Bridge - Connect to Other MCP Servers
The MCP Bridge allows your ducks to access tools from other MCP servers, extending their capabilities beyond just chat. Your ducks can now search documentation, access files, query APIs, and much more!
Note: This is different from the MCP server integration above:
MCP Bridge (MCP_BRIDGE_ENABLED): Ducks USE external MCP servers as clients
MCP Server (MCP_SERVER): Rubber-duck SERVES as an MCP server to any MCP client
Quick Setup
Add these environment variables to enable MCP Bridge:
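For example (MCP_BRIDGE_ENABLED is the flag described above; the approval-mode variable name is a hypothetical placeholder):

```bash
MCP_BRIDGE_ENABLED=true
# Hypothetical name for the approval-mode setting (always | trusted | never);
# check the project's documentation for the exact variable.
MCP_APPROVAL_MODE=trusted
```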
Approval Modes
always: Every tool call requires approval (with session-based memory)
First use of a tool → requires approval
Subsequent uses of the same tool → automatic (until restart)
trusted: Only untrusted tools require approval
Tools in trusted lists execute immediately
Unknown tools require approval
never: All tools execute immediately (use with caution)
Per-Server Trusted Tools
Configure trust levels per MCP server for granular security:
MCP Server Configuration
Configure MCP servers using environment variables:
HTTP Servers
STDIO Servers
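A hypothetical sketch of both transport types (every variable name and URL below is an assumption; consult the project's documentation for the real scheme):

```bash
# HTTP transport (hypothetical names; URL is a placeholder)
MCP_SERVER_CONTEXT7_TYPE=http
MCP_SERVER_CONTEXT7_URL=https://example.com/mcp
# STDIO transport (hypothetical names)
MCP_SERVER_FILES_TYPE=stdio
MCP_SERVER_FILES_COMMAND=npx
```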
Example: Enable Context7 Documentation
Now your ducks can search and retrieve documentation from Context7:
💡 Token Optimization Benefits
Smart Token Management: Ducks can retrieve comprehensive data from MCP servers but return only the essential information you need, saving tokens in your host LLM conversations:
Ask for specifics: "Find TypeScript interfaces documentation and return only the core concepts"
Duck processes full docs: Accesses complete documentation from Context7
Returns condensed results: Provides focused, relevant information while filtering out unnecessary details
Token savings: Reduces response size by 70-90% compared to raw documentation dumps
Example Workflow:
Session-Based Approvals
When using always mode, the system remembers your approvals:
First time: "Duck wants to use search-docs - Approve? ✅"
Next time: Duck uses search-docs automatically (no new approval needed)
Different tool: "Duck wants to use get-examples - Approve? ✅"
Restart: Session memory clears, start over
This eliminates approval fatigue while maintaining security!
Available Tools (Enhanced with MCP)
🦆 ask_duck
Ask a single question to a specific LLM provider. When MCP Bridge is enabled, ducks can automatically access tools from connected MCP servers.
💬 chat_with_duck
Have a conversation with context maintained across messages.
🧹 clear_conversations
Clear all conversation history and start fresh. Useful when switching topics or when context becomes too large.
📋 list_ducks
List all configured providers and their health status.
📊 list_models
List available models for LLM providers.
🔍 compare_ducks
Ask the same question to multiple providers simultaneously.
🏛️ duck_council
Get responses from all configured ducks - like a panel discussion!
Usage Examples
Basic Query
Conversation
Compare Responses
Duck Council
Provider-Specific Setup
Ollama (Local)
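Ollama serves an OpenAI-compatible API locally, by default at http://localhost:11434/v1. A sketch of pointing a duck at it (variable names are assumptions; a local Ollama server needs no real API key):

```bash
OLLAMA_BASE_URL=http://localhost:11434/v1
OLLAMA_MODEL=llama3
```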
LM Studio (Local)
Download LM Studio from https://lmstudio.ai/
Load a model in LM Studio
Start the local server (provides OpenAI-compatible endpoint at localhost:1234/v1)
Google Gemini
Get API key from Google AI Studio
Add to environment:
GEMINI_API_KEY=...
Uses OpenAI-compatible endpoint (beta)
Groq
Get API key from https://console.groq.com/keys
Add to environment:
GROQ_API_KEY=gsk_...
Together AI
Get API key from https://api.together.xyz/
Add to environment:
TOGETHER_API_KEY=...
Verifying OpenAI Compatibility
To check if a provider is OpenAI-compatible:
Look for a /v1/chat/completions endpoint in their API docs
Check if they support the OpenAI SDK
Test with curl:
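A generic check along these lines should return a chat completion from any compatible provider (BASE_URL, API_KEY, and the model name are placeholders):

```bash
curl "$BASE_URL/v1/chat/completions" \
  -H "Content-Type: application/json" \
  -H "Authorization: Bearer $API_KEY" \
  -d '{"model": "gpt-4", "messages": [{"role": "user", "content": "Hello, duck!"}]}'
```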
Development
Run in Development Mode
Run Tests
Lint Code
Type Checking
Docker Support
MCP Rubber Duck provides multi-platform Docker support, working on macOS (Intel & Apple Silicon), Linux (x86_64 & ARM64), Windows (WSL2), and Raspberry Pi 3+.
Quick Start with Pre-built Image
The easiest way to get started is with our pre-built multi-architecture image:
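Using the published image from the Registry section (run flags are illustrative):

```bash
docker pull ghcr.io/nesquikm/mcp-rubber-duck:latest
docker run -d --name rubber-duck \
  -e OPENAI_API_KEY=your-openai-api-key-here \
  ghcr.io/nesquikm/mcp-rubber-duck:latest
```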
Platform-Specific Deployment
🖥️ Desktop/Server (macOS, Linux, Windows)
🥧 Raspberry Pi
🌐 Remote Deployment via SSH
Universal Deployment Script
The scripts/deploy.sh script auto-detects your platform and applies optimal settings:
Available options:
--mode: docker (default), local, or ssh
--platform: pi, desktop, or auto (default)
--profile: lightweight, desktop, with-ollama
--ssh-host: For remote deployment
Platform-Specific Configuration
Raspberry Pi (Memory-Optimized)
Desktop/Server (High-Performance)
Docker Compose Profiles
Build Multi-Architecture Images
For developers who want to build and publish their own multi-architecture images:
Claude Desktop with Remote Docker
Connect Claude Desktop to MCP Rubber Duck running on a remote system:
Platform Compatibility
| Platform | Architecture | Status | Notes |
|---|---|---|---|
| macOS Intel | AMD64 | ✅ Full | Via Docker Desktop |
| macOS Apple Silicon | ARM64 | ✅ Full | Native ARM64 support |
| Linux x86_64 | AMD64 | ✅ Full | Direct Docker support |
| Linux ARM64 | ARM64 | ✅ Full | Servers, Pi 4+ |
| Raspberry Pi 3+ | ARM64 | ✅ Optimized | Memory-limited config |
| Windows | AMD64 | ✅ Full | Via Docker Desktop + WSL2 |
Manual Docker Commands
If you prefer not to use docker-compose:
Architecture
Troubleshooting
Provider Not Working
Check API key is correctly set
Verify endpoint URL is correct
Run health check:
list_ducks({ check_health: true })
Check logs for detailed error messages
Connection Issues
For local providers (Ollama, LM Studio), ensure they're running
Check firewall settings for local endpoints
Verify network connectivity to cloud providers
Rate Limiting
Enable caching to reduce API calls
Configure failover to alternate providers
Adjust max_retries and timeout settings
Contributing
🦆 Want to help make our duck pond better?
We love contributions! Whether you're fixing bugs, adding features, or teaching our ducks new tricks, we'd love to have you join the flock.
Check out our Contributing Guide to get started. We promise it's more fun than a regular contributing guide - it has ducks! 🦆
Quick start for contributors:
Fork the repository
Create a feature branch
Follow our conventional commit guidelines
Add tests for new functionality
Submit a pull request
License
MIT License - see LICENSE file for details
Acknowledgments
Inspired by the rubber duck debugging method
Built on the Model Context Protocol (MCP)
Uses OpenAI SDK for universal compatibility
📝 Changelog
See CHANGELOG.md for a detailed history of changes and releases.
📦 Registry & Directory
MCP Rubber Duck is available through multiple channels:
NPM Package: npmjs.com/package/mcp-rubber-duck
Docker Images: ghcr.io/nesquikm/mcp-rubber-duck
MCP Registry: io.github.nesquikm/rubber-duck in the official MCP Registry
Glama Directory: glama.ai/mcp/servers/@nesquikm/mcp-rubber-duck
Awesome MCP Servers: Listed in the community directory
Support
Report issues: https://github.com/nesquikm/mcp-rubber-duck/issues
Documentation: https://github.com/nesquikm/mcp-rubber-duck/wiki
Discussions: https://github.com/nesquikm/mcp-rubber-duck/discussions
🦆 Happy Debugging with your AI Duck Panel! 🦆