
MCP Utility Tools


A collection of utility tools for the Model Context Protocol (MCP) that provide caching, retry logic, batch operations, and rate limiting capabilities to enhance any MCP-based workflow.

Features

  • πŸ”„ Retry with Exponential Backoff - Automatically retry failed operations with configurable delays

  • πŸ’Ύ TTL-based Caching - Cache expensive operations with automatic expiration

  • πŸš€ Batch Operations - Process multiple operations in parallel with concurrency control

  • 🚦 Rate Limiting - Prevent API abuse with sliding window rate limiting

  • πŸ” Full TypeScript Support - Type-safe with comprehensive TypeScript definitions


Installation

npm install mcp-utility-tools

# or with yarn
yarn add mcp-utility-tools

# or with bun
bun add mcp-utility-tools

Quick Start

1. Add to Claude Desktop

Add the utility tools to your Claude Desktop configuration:

{ "mcpServers": { "utility-tools": { "command": "npx", "args": ["mcp-utility-tools"] } } }

2. Use with Claude

Once configured, Claude can use these tools to enhance any workflow:

# Check cache before expensive operation
cache_result = mcp_cache_get(key="api-response", namespace="github")

if not cache_result["found"]:
    # Fetch data with retry
    response = fetch_with_retry("https://api.github.com/user/repos")

    # Cache for 5 minutes
    mcp_cache_put(
        key="api-response",
        value=response,
        ttl_seconds=300,
        namespace="github"
    )

Available Tools

πŸ”„ retry_operation

Retry operations with exponential backoff and jitter.

{ "tool": "retry_operation", "arguments": { "operation_id": "unique-operation-id", "operation_type": "http_request", "operation_data": { "url": "https://api.example.com/data", "method": "GET" }, "max_retries": 3, "initial_delay_ms": 1000 } }

Features:

  • Tracks retry attempts across multiple calls

  • Exponential backoff with configurable delays

  • Optional jitter to prevent thundering herd

  • Prevents duplicate retries for successful operations
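
The delay schedule itself is simple. The sketch below illustrates the idea in TypeScript; the helper name and the 30-second cap are illustrative assumptions, not values taken from this package.

// Minimal sketch of exponential backoff with jitter (illustrative only;
// the 30s cap is an assumption, not a documented limit of retry_operation).
function backoffDelayMs(attempt: number, initialDelayMs = 1000, jitter = true): number {
  const base = initialDelayMs * Math.pow(2, attempt); // 1s, 2s, 4s, ...
  const capped = Math.min(base, 30_000);              // cap each individual delay
  // Full jitter: pick a random point in [0, capped) to avoid thundering-herd retries.
  return jitter ? Math.random() * capped : capped;
}

// Example: the waits a caller would see with max_retries = 3, initial_delay_ms = 1000
for (let attempt = 0; attempt < 3; attempt++) {
  console.log(`attempt ${attempt + 1}: wait ~${Math.round(backoffDelayMs(attempt))}ms`);
}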

πŸ’Ύ Cache Operations

cache_get

Retrieve values from cache with TTL support.

{ "tool": "cache_get", "arguments": { "key": "user-data-123", "namespace": "users" } }

cache_put

Store values with automatic expiration.

{ "tool": "cache_put", "arguments": { "key": "user-data-123", "value": { "name": "John", "role": "admin" }, "ttl_seconds": 300, "namespace": "users" } }

Features:

  • Namespace support to prevent key collisions

  • Automatic cleanup of expired entries

  • Configurable TTL (1 second to 24 hours)

  • Memory-efficient storage
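
Conceptually, the cache is an in-memory map of namespace-qualified keys to values with an expiry timestamp, and expired entries are dropped on access. A rough sketch of that idea (the UtilityCache class is illustrative, not this package's internal code):

// Illustrative in-memory TTL cache; not the package's actual implementation.
type Entry = { value: unknown; expiresAt: number };

class UtilityCache {
  private store = new Map<string, Entry>();

  put(key: string, value: unknown, ttlSeconds: number, namespace = "default"): void {
    this.store.set(`${namespace}:${key}`, {
      value,
      expiresAt: Date.now() + ttlSeconds * 1000,
    });
  }

  get(key: string, namespace = "default"): { found: boolean; value?: unknown } {
    const fullKey = `${namespace}:${key}`;
    const entry = this.store.get(fullKey);
    if (!entry || entry.expiresAt <= Date.now()) {
      this.store.delete(fullKey); // lazy cleanup of expired entries
      return { found: false };
    }
    return { found: true, value: entry.value };
  }
}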

πŸš€ batch_operation

Process multiple operations with controlled concurrency.

{ "tool": "batch_operation", "arguments": { "operations": [ { "id": "op1", "type": "fetch", "data": { "url": "/api/1" } }, { "id": "op2", "type": "fetch", "data": { "url": "/api/2" } }, { "id": "op3", "type": "fetch", "data": { "url": "/api/3" } } ], "concurrency": 2, "timeout_ms": 5000, "continue_on_error": true, "use_cache": true } }

Features:

  • Configurable concurrency (1-20 operations)

  • Per-operation timeout

  • Continue or fail-fast on errors

  • Optional result caching

  • Maintains order of results
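
This style of batching is essentially a small worker pool: up to `concurrency` workers pull operations from a shared queue, and results are written back by index so output order matches input order. A rough sketch (illustrative only; runBatch is not part of this package's API):

// Illustrative worker-pool batching: `concurrency` workers share one queue,
// and results are stored by index so output order matches input order.
async function runBatch<T, R>(
  items: T[],
  worker: (item: T) => Promise<R>,
  concurrency: number,
  continueOnError = true
): Promise<Array<R | Error>> {
  const results = new Array<R | Error>(items.length);
  let next = 0;

  async function run(): Promise<void> {
    while (next < items.length) {
      const index = next++;
      try {
        results[index] = await worker(items[index]);
      } catch (err) {
        if (!continueOnError) throw err; // fail fast
        results[index] = err as Error;   // record the error and keep going
      }
    }
  }

  await Promise.all(Array.from({ length: Math.min(concurrency, items.length) }, run));
  return results;
}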

🚦 rate_limit_check

Implement sliding window rate limiting.

{ "tool": "rate_limit_check", "arguments": { "resource": "api.github.com", "max_requests": 60, "window_seconds": 60, "increment": true } }

Features:

  • Per-resource tracking

  • Sliding window algorithm

  • Automatic reset after time window

  • Check without incrementing option
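
A sliding-window limiter keeps the timestamps of recent requests per resource and counts how many fall inside the window. The sketch below illustrates the idea; the response fields mirror the `allowed` and `reset_in_seconds` names used in the Slack example further down, but the function itself is illustrative, not this package's internals.

// Illustrative sliding-window limiter: keep recent timestamps per resource,
// count those inside the window, and optionally record the new request.
const windows = new Map<string, number[]>();

function rateLimitCheck(
  resource: string,
  maxRequests: number,
  windowSeconds: number,
  increment = true
): { allowed: boolean; remaining: number; reset_in_seconds: number } {
  const now = Date.now();
  const cutoff = now - windowSeconds * 1000;
  const recent = (windows.get(resource) ?? []).filter((t) => t > cutoff);

  const allowed = recent.length < maxRequests;
  if (allowed && increment) recent.push(now);
  windows.set(resource, recent);

  const oldest = recent[0] ?? now;
  return {
    allowed,
    remaining: Math.max(0, maxRequests - recent.length),
    reset_in_seconds: Math.ceil((oldest + windowSeconds * 1000 - now) / 1000),
  };
}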

Integration Examples

With GitHub MCP Server

// Cache GitHub API responses
async function getRepositoryWithCache(owner: string, repo: string) {
  const cacheKey = `github:${owner}/${repo}`;

  // Check cache first
  const cached = await mcp_cache_get({ key: cacheKey, namespace: "github" });
  if (cached.found) {
    return cached.value;
  }

  // Fetch with retry
  const data = await retryableGitHubCall(owner, repo);

  // Cache for 10 minutes
  await mcp_cache_put({
    key: cacheKey,
    value: data,
    ttl_seconds: 600,
    namespace: "github"
  });

  return data;
}

With Slack MCP Server

// Rate-limited Slack notifications
async function sendSlackNotifications(messages: string[], channel: string) {
  for (const message of messages) {
    // Check rate limit (and record the request) before sending
    let canSend = await mcp_rate_limit_check({
      resource: `slack:${channel}`,
      max_requests: 10,
      window_seconds: 60,
      increment: true
    });

    // If limited, wait for the window to reset, then re-check so the request is counted
    while (!canSend.allowed) {
      console.log(`Rate limited. Retry in ${canSend.reset_in_seconds}s`);
      await sleep(canSend.reset_in_seconds * 1000);
      canSend = await mcp_rate_limit_check({
        resource: `slack:${channel}`,
        max_requests: 10,
        window_seconds: 60,
        increment: true
      });
    }

    await mcp_slack_post_message({ channel_id: channel, text: message });
  }
}

Architecture

β”Œβ”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β” β”Œβ”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β” β”Œβ”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β” β”‚ β”‚ β”‚ β”‚ β”‚ β”‚ β”‚ Claude/Client │────▢│ MCP Utility Tools│────▢│ Cache Storage β”‚ β”‚ β”‚ β”‚ β”‚ β”‚ (In-Memory) β”‚ β””β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”˜ β””β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”˜ β””β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”˜ β”‚ β”‚ β”‚ β”‚ β–Ό β–Ό β”Œβ”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β” β”Œβ”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β” β”‚ Other MCP β”‚ β”‚ Retry/Rate β”‚ β”‚ Servers β”‚ β”‚ Limit Tracking β”‚ β””β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”˜ β””β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”˜

Development

# Clone the repository
git clone https://github.com/haasonsaas/mcp-utility-tools.git
cd mcp-utility-tools

# Install dependencies
npm install

# Build the project
npm run build

# Run tests
npm test

# Run in development mode
npm run dev

Testing

Run the comprehensive test suite:

# Unit tests
npm test

# Integration tests with test harness
npm run test:integration

# Test with MCP Inspector
npx @modelcontextprotocol/inspector build/index-v2.js

Contributing

We welcome contributions! Please see our Contributing Guide for details.

Areas for Contribution

  • πŸ”Œ Storage Backends: Add Redis, SQLite support

  • πŸ”§ New Tools: Circuit breakers, request deduplication

  • πŸ“Š Metrics: Add performance tracking and analytics

  • 🌐 Examples: More integration examples with other MCP servers

License

MIT Β© Jonathan Haas

Acknowledgments

Built on top of the Model Context Protocol SDK by Anthropic.

