Joke MCP Server

by ericqian77
# Integration Guide: Waiting Flow Jokes

This note shows how to surface MCP jokes in an existing task runner so developers see a quick laugh while a background job finishes.

## 1. Start the MCP server

```bash
LOG_VERBOSE=false ALLOW_NET=false node ./src/jokes-mcp.js
```

Keep the process running in the background (tmux tab, supervisor, etc.).

## 2. Ask for a joke during a wait

If you do not keep the server running, spawn it on demand, read one joke, then exit:

```bash
printf 'getJoke {"category":"general"}\n' | node ./src/jokes-mcp.js
```

For a long-running job, you can wrap the request in a helper:

```bash
show_waiting_joke() {
  printf 'getJoke {"category":"programming"}\n' | node ./src/jokes-mcp.js
}

run_slow_build() {
  show_waiting_joke
  pnpm build
}
```

## 3. Integrate with the Codex CLI

Add the server to your `codex.toml` (full block in the README) and enable MCP hints:

```toml
[mcp_servers.jokes]
command = "node"
args = ["./src/jokes-mcp.js"]
startup_timeout_ms = 20000
```

Codex automatically forwards the `health` probe and manages the process lifecycle; when your workflow enters a waiting period, call:

```bash
codex mcp send jokes 'getJoke {"lang":"en"}'
```

## 4. SDK/Script integration

### Node.js example

```js
import { spawn } from 'node:child_process';

function requestJoke(vars = {}) {
  return new Promise((resolve, reject) => {
    const args = ['getJoke', JSON.stringify(vars)];
    const proc = spawn('node', ['./src/jokes-mcp.js']);
    let output = '';

    proc.stdout.on('data', (chunk) => {
      output += chunk;
    });

    // Reject if the process cannot be spawned at all.
    proc.on('error', reject);

    proc.on('close', () => {
      try {
        // The server's response is the last line of stdout.
        const lastLine = output.trim().split(/\r?\n/).at(-1);
        resolve(JSON.parse(lastLine));
      } catch (error) {
        reject(error);
      }
    });

    proc.stdin.write(`${args.join(' ')}\n`);
    proc.stdin.end();
  });
}
```

Call `requestJoke({ category: 'programming', lang: 'en' })` before polling status updates.
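The last-line parsing inside `requestJoke` is useful on its own if other scripts read the server's stdout (for example, after a one-shot `printf | node` invocation). A minimal sketch, assuming the server prints its JSON response as the final non-empty line of output:

```javascript
// Parse the last non-empty stdout line as JSON (a sketch).
// Earlier lines (startup logs, etc.) are ignored.
function parseLastJsonLine(output) {
  const lines = output.trim().split(/\r?\n/);
  return JSON.parse(lines.at(-1));
}
```

This keeps log noise on earlier lines from breaking the parse, which is why the Node.js helper above takes the same approach.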
### Python snippet

```python
import json
import subprocess

def get_joke(payload=None):
    payload = payload or {"category": "programming"}
    cmd = ["node", "./src/jokes-mcp.js"]
    proc = subprocess.Popen(cmd, stdin=subprocess.PIPE, stdout=subprocess.PIPE, text=True)
    # communicate() writes the request, closes stdin, and collects all stdout.
    result, _ = proc.communicate(f"getJoke {json.dumps(payload)}\n")
    # The server's response is the last line of output.
    return json.loads(result.strip().splitlines()[-1])
```

## 5. Tips

- Keep `LOG_VERBOSE=false` unless debugging to avoid stderr noise.
- When running offline CI, guarantee deterministic jokes by limiting requests to the local provider (`ALLOW_NET=false`).
- Cache results if you request multiple jokes during the same wait loop to avoid respawning the process repeatedly.
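The caching tip above can be sketched as a small wrapper. This is a sketch, not part of the server: `fetchJoke` stands in for any async joke fetcher (such as the `requestJoke` helper), and the TTL value is an arbitrary assumption.

```javascript
// In-memory cache so one wait loop reuses a joke instead of
// respawning the server process on every poll (a sketch).
function makeCachedJokeFetcher(fetchJoke, ttlMs = 60_000) {
  const cache = new Map(); // category -> { value, expires }
  return async function cached(category = 'general') {
    const hit = cache.get(category);
    if (hit && hit.expires > Date.now()) return hit.value; // fresh: reuse
    const value = await fetchJoke(category);               // stale/missing: fetch
    cache.set(category, { value, expires: Date.now() + ttlMs });
    return value;
  };
}
```

Within the TTL window, repeated calls for the same category return the cached joke without spawning another process.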

MCP directory API

We provide all the information about MCP servers via our MCP API.

```bash
curl -X GET 'https://glama.ai/api/mcp/v1/servers/ericqian77/joke-mcp'
```

If you have feedback or need assistance with the MCP directory API, please join our Discord server.