taskbeacon-mcp

A Model Context Protocol (MCP) server for taskbeacon.


Overview

taskbeacon-mcp is a lightweight FastMCP server that lets a language model clone, transform, download, and localize taskbeacon task templates through a single entry-point tool.

This README provides instructions for setting up and using taskbeacon-mcp in different environments.


1 · Quick Start (Recommended)

The easiest way to use taskbeacon-mcp is with uvx. This tool automatically downloads the package from PyPI, installs it and its dependencies into a temporary virtual environment, and runs it in a single step. No manual cloning or setup is required.

1.1 · Prerequisites

Ensure you have uv installed; the uvx command ships with it. If not, you can install it with pip:

pip install uv
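
To confirm the install works, you can launch the server once directly from the command line; since it speaks stdio (see the configuration below), it will simply wait for an MCP client on stdin/stdout until you stop it with Ctrl+C:

uvx taskbeacon-mcp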

1.2 · LLM Tool Configuration (JSON)

To integrate taskbeacon-mcp with your LLM tool (like Gemini CLI or Cursor), use the following JSON configuration. This tells the tool how to run the server using uvx.

{ "name": "taskbeacon-mcp", "type": "stdio", "description": "Local FastMCP server for taskbeacon task operations. Uses uvx for automatic setup.", "isActive": true, "command": "uvx", "args": [ "taskbeacon-mcp" ] }

With this setup, the LLM can now use the taskbeacon-mcp tools.


2 · Manual Setup (For Developers)

This method is for developers who want to modify or contribute to the taskbeacon-mcp source code.

2.1 · Environment Setup

  1. Create a virtual environment and install dependencies: This project uses uv. Make sure you are in the project root directory.

    # Create and activate the virtual environment
    python -m venv .venv
    source .venv/bin/activate  # On Windows, use: .venv\Scripts\activate

    # Install dependencies in editable mode
    pip install -e .

2.2 · Running Locally (StdIO)

This is the standard mode for local development, where the server communicates over STDIN/STDOUT.

  1. Launch the server:

    python taskbeacon_mcp/main.py
  2. LLM Tool Configuration (JSON): To use your local development server with an LLM tool, use the following configuration. Note that you should replace the example path in args with the absolute path to the main.py file on your machine.

    { "name": "taskbeacon-mcp_dev", "type": "stdio", "description": "Local development server for taskbeacon task operations.", "isActive": true, "command": "python", "args": [ "path\\to\\taskbeacon_mcp\\main.py" ] }

2.3 · Running as a Persistent Server (SSE)

For a persistent, stateful server, you can run taskbeacon-mcp using Server-Sent Events (SSE). This is ideal for production or when multiple clients need to interact with the same server instance.

  1. Modify the transport: In taskbeacon_mcp/main.py, change the last line from mcp.run(transport="stdio") to:

    mcp.run(transport="sse", port=8000)
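
    For orientation, here is a minimal sketch of what the tail of main.py looks like after this change, assuming the module defines its FastMCP instance as mcp (as the line above implies):

    ```python
    # Tail of taskbeacon_mcp/main.py after the change (illustrative sketch).
    # Assumes the module already defines: mcp = FastMCP(...)
    if __name__ == "__main__":
        # Serve over Server-Sent Events instead of stdio so multiple
        # clients can connect to one persistent process.
        mcp.run(transport="sse", port=8000)
    ```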

  2. Run the server:

    python taskbeacon_mcp/main.py

    The server will now be accessible at http://localhost:8000/mcp.

  3. LLM Tool Configuration (JSON): To connect an LLM tool to the running SSE server, use a configuration like this:

    { "name": "taskbeacon-mcp_sse", "type": "http", "description": "Persistent SSE server for taskbeacon task operations.", "isActive": true, "endpoint": "http://localhost:8000/mcp" }

3 · Conceptual Workflow

  1. User describes the task they want (e.g. “Make a Stroop out of Flanker”).

  2. LLM calls the build_task tool:

    • If the model already knows the best starting template, it passes source_task.

    • Otherwise it omits source_task, receives a menu created by choose_template_prompt, picks a repo, then calls build_task again with that repo.

  3. The server clones the chosen template, returns a Stage 0→5 instruction prompt (transform_prompt) plus the local template path.

  4. The LLM edits files locally, optionally invokes localize to translate and adapt config.yaml, then zips / commits the new task.
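
The two-step build_task flow above can be expressed as client calls. A hedged sketch, again assuming the fastmcp client API from section 2.3; the argument values are invented for illustration:

```python
# workflow_sketch.py — illustrative client-side version of the flow above.
import asyncio
from fastmcp import Client

async def main():
    async with Client("http://localhost:8000/mcp") as client:
        # No source_task yet: build_task returns prompt_messages from
        # choose_template_prompt so the LLM can pick a starting template.
        menu = await client.call_tool("build_task", {"target_task": "stroop"})

        # ... the LLM inspects the menu and picks a repo, e.g. "flanker" ...

        # Second call with the chosen template: the server clones it and
        # returns the Stage 0→5 prompt plus the local template path.
        result = await client.call_tool(
            "build_task",
            {"target_task": "stroop", "source_task": "flanker"},
        )
        print(result)

asyncio.run(main())
```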


4 · Exposed Tools

| Tool | Arguments | Purpose / Return |
|------|-----------|------------------|
| build_task | target_task:str, source_task?:str | Main entry-point. With source_task → clones the repo and returns prompt (Stage 0→5) + template_path (local clone). Without source_task → returns prompt_messages from choose_template_prompt so the LLM can pick the best starting template, then call build_task again. |
| list_tasks | none | Returns an array of objects: { repo, readme_snippet, branches }, where branches lists up to 20 branch names for that repo. |
| download_task | repo:str | Clones any template repo from the registry and returns its local path. |
| localize | task_path:str, target_language:str, voice?:str | Reads config.yaml, wraps it in localize_prompt, and returns prompt_messages. If a voice is not provided, it first calls list_voices to find suitable options. Also deletes old _voice.mp3 files. |
| list_voices | filter_lang?:str | Returns a human-readable string of available text-to-speech voices from taskbeacon, optionally filtered by language (e.g., "ja", "en"). |
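
To make the localize/list_voices interplay concrete, here is a hedged sketch of how a client might drive it, under the same fastmcp-client assumption as above and with an invented task path:

```python
# localize_sketch.py — illustrative use of list_voices + localize.
import asyncio
from fastmcp import Client

async def main():
    async with Client("http://localhost:8000/mcp") as client:
        # Browse available Japanese TTS voices first (optional: localize
        # calls list_voices itself when no voice is passed).
        voices = await client.call_tool("list_voices", {"filter_lang": "ja"})
        print(voices)

        # Request the localization prompt_messages for a cloned task.
        # "./stroop" is a hypothetical path returned by build_task/download_task.
        messages = await client.call_tool(
            "localize",
            {"task_path": "./stroop", "target_language": "ja"},
        )
        print(messages)

asyncio.run(main())
```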


5 · Exposed Prompts

| Prompt | Parameters | Description |
|--------|------------|-------------|
| transform_prompt | source_task, target_task | Single User message containing the full Stage 0→5 instructions to convert source_task into target_task. |
| choose_template_prompt | desc, candidates:list[{repo,readme_snippet}] | Three User messages: task description, template list, and selection criteria. The LLM must reply with one repo name or the literal word NONE. |
| localize_prompt | yaml_text, target_language, voice_options? | Two-message sequence: strict translation instruction + raw YAML. The LLM must return the fully-translated YAML body, adding voice: <short_name> if suitable options were provided. |
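
Prompts are fetched rather than called. A minimal sketch, assuming the fastmcp client's get_prompt method and its standard GetPromptResult return shape:

```python
# prompt_sketch.py — illustrative retrieval of an exposed prompt.
import asyncio
from fastmcp import Client

async def main():
    async with Client("http://localhost:8000/mcp") as client:
        # Retrieve the Stage 0→5 transformation instructions.
        result = await client.get_prompt(
            "transform_prompt",
            {"source_task": "flanker", "target_task": "stroop"},
        )
        for message in result.messages:
            print(message.role, message.content)

asyncio.run(main())
```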

