🚀 Prompt Optimizer MCP
A Model Context Protocol (MCP) server that provides intelligent tools for optimizing and scoring LLM prompts using deterministic heuristics.
🎯 Overview
The Prompt Optimizer MCP server offers two powerful tools:
- `optimize_prompt`: Generate 3 optimized variants of a raw LLM prompt in different styles
- `score_prompt`: Evaluate the effectiveness of an improved prompt relative to the original
Perfect for developers, content creators, and AI practitioners who want to improve their prompt engineering workflow.
✨ Features
🎨 Prompt Optimization Styles
- Creative: Enhanced with descriptive adjectives and engaging language
- Precise: Concise and focused, removing redundant words
- Fast: Optimized for quick processing with shorter synonyms
📊 Intelligent Scoring Algorithm
The scoring system evaluates prompts based on:
- Length optimization (40%): Prefers shorter, more concise prompts
- Keyword preservation (30%): Maintains important terms from the original
- Clarity improvement (30%): Reduces redundancy and improves structure
🔧 Technical Features
- ✅ Stateless: No external dependencies or state management
- ✅ Deterministic: Same inputs always produce same outputs
- ✅ Error-free: Comprehensive input validation and error handling
- ✅ Fast: Simple heuristics for quick processing
- ✅ Extensible: Easy to add new styles and scoring metrics
- ✅ Dual Transport: Supports both STDIO (MCP) and HTTP (deployment)
📁 Project Structure
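A representative layout, inferred from the files referenced throughout this README (`requirements.txt` and `Dockerfile` are assumptions based on the pip and Docker references):

```
prompt-optimizer-mcp/
├── server.py            # MCP server entry point (STDIO transport)
├── start.py             # HTTP entry point for web deployment
├── requirements.txt     # Python dependencies (assumed)
├── Dockerfile           # Container build definition (assumed)
├── DEPLOYMENT.md        # Detailed deployment instructions
├── tools/
│   └── optimize.py      # Optimization and scoring logic
└── tests/
    └── test_optimize.py # Test suite
```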
🚀 Quick Start
1. Clone the Repository
2. Install Dependencies
3. Run Tests
4. Start the Server
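The commands below assume a pytest-style suite and a `requirements.txt` at the repository root; the clone URL is a placeholder:

```bash
# 1. Clone the repository (placeholder URL)
git clone https://github.com/<your-username>/prompt-optimizer-mcp.git
cd prompt-optimizer-mcp

# 2. Install dependencies
pip install -r requirements.txt

# 3. Run tests
python -m pytest tests/

# 4. Start the server (STDIO mode)
python server.py
```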
🛠️ Installation
Prerequisites
- Python 3.11 or higher
- pip package manager
Install Dependencies
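Assuming dependencies are pinned in a `requirements.txt` at the repository root:

```bash
pip install -r requirements.txt
```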
⚙️ Configuration
For Cursor IDE
Create `.cursor/mcp.json`:
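A minimal configuration following Cursor's `mcpServers` schema (the `prompt-optimizer` key is an arbitrary name; use an absolute path to `server.py` if the working directory differs):

```json
{
  "mcpServers": {
    "prompt-optimizer": {
      "command": "python",
      "args": ["server.py"]
    }
  }
}
```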
For Other MCP Clients
Configure your MCP client to use:
- Command: `python server.py`
- Transport: STDIO (default)
📖 Usage Examples
Using the MCP Server
Once configured, you can use the tools through any MCP client:
Optimize a Prompt
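For example, a tool call as an MCP client might issue it (the argument names `raw_prompt` and `style` are illustrative assumptions; check the tool schema exposed by the server):

```json
{
  "tool": "optimize_prompt",
  "arguments": {
    "raw_prompt": "Write me something about dogs that is good",
    "style": "precise"
  }
}
```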
Score a Prompt
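Similarly, with assumed argument names:

```json
{
  "tool": "score_prompt",
  "arguments": {
    "raw_prompt": "Write me something about dogs that is good",
    "improved_prompt": "Write a concise article about dogs"
  }
}
```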
HTTP API Usage
When deployed, the server also provides HTTP endpoints:
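The route and port below are illustrative assumptions; check `start.py` for the actual endpoints:

```bash
# Hypothetical route and port; verify against start.py
curl -X POST http://localhost:8000/optimize \
  -H "Content-Type: application/json" \
  -d '{"raw_prompt": "Write me something about dogs", "style": "creative"}'
```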
Direct Python Usage
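Since the logic lives in `tools/optimize.py`, it should be importable directly; the function names below mirror the MCP tool names but are assumptions about the module's API:

```python
# Function names assumed to mirror the MCP tool names; verify in tools/optimize.py
from tools.optimize import optimize_prompt, score_prompt

# Generate the three styled variants
variants = optimize_prompt("Write me something about dogs", style="creative")
print(variants)

# Score an improved prompt against the original
score = score_prompt(
    raw_prompt="Write me something about dogs",
    improved_prompt="Write an engaging article about dogs",
)
print(score)
```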
🧪 Testing
Run the comprehensive test suite:
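Assuming a pytest-compatible suite under `tests/`:

```bash
python -m pytest tests/ -v
```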
🚀 Deployment
Automated Deployment
Use the deployment script:
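The script name below is an assumption; substitute the actual script shipped in the repository:

```bash
# Hypothetical script name; check the repository root
./deploy.sh
```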
This will:
- Run all tests
- Install dependencies
- Run linting checks
- Build Docker image (if available)
- Create deployment package
Manual Deployment
Deploy to Smithery
- Install the Smithery CLI
- Authenticate
- Deploy
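The Smithery CLI is distributed via npm; the `login` and `deploy` subcommands below are assumptions, so verify against `smithery --help`:

```bash
npm install -g @smithery/cli

# Assumed subcommands; verify with `smithery --help`
smithery login
smithery deploy
```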
Deploy with Docker
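A typical container workflow; the image name is a placeholder, and the HTTP port served by `start.py` is an assumption:

```bash
docker build -t prompt-optimizer-mcp .
docker run -p 8000:8000 prompt-optimizer-mcp
```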
Deploy to Other Platforms
The server supports both STDIO (for MCP clients) and HTTP (for web deployment) transports:
- STDIO Mode: `python server.py` (for MCP clients)
- HTTP Mode: `python start.py` (for web deployment)
Your MCP server will be available at: https://prompt-optimizer-mcp.smithery.ai
For detailed deployment instructions, see DEPLOYMENT.md.
🔧 Development
Adding New Optimization Styles
- Add the new style to the `Literal` type in `server.py`
- Implement the style function in `tools/optimize.py`
- Add corresponding tests in `tests/test_optimize.py`
Extending the Scoring Algorithm
Modify the `score_prompt` function in `tools/optimize.py` to include additional metrics or adjust weights. A sketch of the documented weighting is shown below.
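A minimal sketch of how the documented 40/30/30 weighting could be composed; the helper functions are hypothetical stand-ins, not the module's actual internals:

```python
# Hypothetical helpers illustrating the documented 40/30/30 weighting;
# the real implementation lives in tools/optimize.py.

def _length_score(raw: str, improved: str) -> float:
    """Prefer shorter prompts: 1.0 when improved is no longer than raw."""
    if len(improved) == 0:
        return 0.0
    return min(1.0, len(raw) / len(improved))

def _keyword_score(raw: str, improved: str) -> float:
    """Fraction of the original words preserved in the improved prompt."""
    raw_words = set(raw.lower().split())
    if not raw_words:
        return 1.0
    kept = raw_words & set(improved.lower().split())
    return len(kept) / len(raw_words)

def _clarity_score(improved: str) -> float:
    """Reward low word repetition as a simple redundancy proxy."""
    words = improved.lower().split()
    if not words:
        return 0.0
    return len(set(words)) / len(words)

def score_prompt(raw_prompt: str, improved_prompt: str) -> float:
    """Weighted combination, per the scoring section above."""
    return (
        0.4 * _length_score(raw_prompt, improved_prompt)
        + 0.3 * _keyword_score(raw_prompt, improved_prompt)
        + 0.3 * _clarity_score(improved_prompt)
    )
```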
Running Locally
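Both transports described above can be exercised locally:

```bash
# STDIO mode (for MCP clients such as Cursor)
python server.py

# HTTP mode (for web deployment)
python start.py
```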
📊 Performance
- Response Time: < 100ms for most operations
- Memory Usage: ~50MB typical
- CPU Usage: Minimal (stateless operations)
- Scalability: Auto-scales from 1-5 replicas on Smithery
🤝 Contributing
We welcome contributions! Please follow these steps:
- Fork the repository
- Create a feature branch (`git checkout -b feature/amazing-feature`)
- Commit your changes (`git commit -m 'Add amazing feature'`)
- Push to the branch (`git push origin feature/amazing-feature`)
- Open a Pull Request
Development Setup
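A typical setup for contributors (the clone URL is a placeholder and `requirements.txt` is assumed):

```bash
git clone https://github.com/<your-username>/prompt-optimizer-mcp.git
cd prompt-optimizer-mcp
pip install -r requirements.txt
python -m pytest tests/
```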
📝 License
This project is licensed under the MIT License - see the LICENSE file for details.
🙏 Acknowledgments
- Model Context Protocol for the MCP specification
- MCP Python SDK for the server framework
- Smithery for deployment platform
📞 Support
- Issues: GitHub Issues
- Discussions: GitHub Discussions
- Documentation: DEPLOYMENT.md
Made with ❤️ for the AI community