# PRIMS – Python Runtime Interpreter MCP Server

PRIMS is a tiny open-source Model Context Protocol (MCP) server that lets LLM agents run arbitrary Python code in a secure, throw-away sandbox.

- **One tool, one job.** Exposes a single MCP tool – `run_code` – that executes user-supplied Python and streams back stdout / stderr.
- **Isolated & reproducible.** Each call spins up a fresh virtual-env, installs any requested pip packages, mounts optional read-only files, then nukes the workspace.
- **Zero config.** Works over MCP/stdio or drop it in Docker.
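As a rough sketch of what a `run_code` call might carry, the tool takes the source plus optional pip dependencies. The argument names below (`code`, `requirements`) are illustrative assumptions, not the server's documented schema; check the tool listing for the real field names.

```python
# Illustrative run_code payload. The argument names ("code",
# "requirements") are assumptions for this sketch; inspect the server's
# tool schema (e.g. via examples/list_tools.py) for the real field names.
payload = {
    "code": "import platform; print(platform.python_version())",
    "requirements": ["pandas"],  # pip packages installed into the fresh venv
}

# An MCP client would send this as the tool arguments, e.g.:
#   await client.call_tool("run_code", payload)
```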


## Quick-start

### 1. Local development environment

```bash
chmod +x scripts/setup_env.sh   # once, to make the script executable
./scripts/setup_env.sh          # creates .venv & installs deps

# activate the venv in each new shell
source .venv/bin/activate
```

### 2. Launch the server

```bash
python -m server.main   # binds http://0.0.0.0:9000/mcp
```

### 3. Docker

```bash
# Quick one-liner (build + run)
chmod +x scripts/docker_run.sh
./scripts/docker_run.sh   # prints the MCP URL when ready
```

## Examples

### List available tools

You can use the provided script to list all tools exposed by the server:

```bash
python examples/list_tools.py
```

Expected output (tool names and descriptions may vary):

```
Available tools:
- run_code: Execute Python code in a secure sandbox with optional dependencies & file mounts.
- list_dir: List files/directories in your session workspace.
- preview_file: Preview up to 8 KB of a text file from your session workspace.
- persist_artifact: Upload an output/ file to a presigned URL for permanent storage.
- mount_file: Download a remote file once per session to mounts/<path>.
```

### Run code via the MCP server

```bash
python examples/run_code.py
```

### Mount a dataset once & reuse it

```bash
python examples/mount_and_run.py
```

This mounts a CSV with `mount_file` and then reads it inside `run_code` without re-supplying the URL.
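The mount-then-run flow can be sketched as the pair of tool calls below. The tool names match the README; the argument names (`url`, `path`, `code`) and the `client.call_tool` shape are assumptions borrowed from the `persist_artifact` example later in this document.

```python
# Hedged sketch of the mount-then-run flow. Tool names come from the
# README; the argument names ("url", "path", "code") are assumptions.
async def mount_and_analyze(client, csv_url: str):
    # Download the remote CSV once; it lands under mounts/ for this session.
    await client.call_tool("mount_file", {"url": csv_url, "path": "data.csv"})
    # Later calls can read the mounted file without re-supplying the URL.
    return await client.call_tool("run_code", {
        "code": (
            "import csv\n"
            "with open('mounts/data.csv') as f:\n"
            "    print(sum(1 for _ in csv.reader(f)), 'rows')\n"
        ),
    })
```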

### Inspect your session workspace

```bash
python examples/inspect_workspace.py
```

This shows how to use the `list_dir` and `preview_file` tools to browse files your code created.
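A minimal sketch of that browsing pattern, pairing `list_dir` with `preview_file`. The tool names come from the README; the `path` argument name and the example file path are assumptions for illustration.

```python
# Hedged sketch: browse the session workspace. Tool names come from the
# README; the "path" argument name and the file name are assumptions.
async def inspect_workspace(client):
    # List files/directories at the workspace root.
    listing = await client.call_tool("list_dir", {"path": "."})
    # Preview up to 8 KB of a text file the sandboxed code wrote.
    preview = await client.call_tool("preview_file", {"path": "output/report.txt"})
    return listing, preview
```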

Persist an artifact to permanent storage

The persist_artifact tool uploads a file from your output/ directory to a presigned URL.

Example (Python):

await client.call_tool("persist_artifact", { "relative_path": "plots/plot.png", "presigned_url": "https://bucket.s3.amazonaws.com/...signature...", })

### Download an artifact

Small artifacts can be fetched directly:

```bash
curl -H "mcp-session-id: <your-session-id>" \
     http://localhost:9000/artifacts/plots/plot.png -o plot.png
```
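The same request can be built with the Python standard library; the endpoint path and the `mcp-session-id` header come from the curl command above, and the rest is plain `urllib`.

```python
import urllib.request

# Stdlib equivalent of the curl command above. The /artifacts/ endpoint
# and the mcp-session-id header come from the README.
def artifact_request(session_id: str, relative_path: str,
                     base_url: str = "http://localhost:9000") -> urllib.request.Request:
    req = urllib.request.Request(f"{base_url}/artifacts/{relative_path}")
    # The server uses this header to locate your session workspace.
    req.add_header("mcp-session-id", session_id)
    return req

# To actually download against a running server:
#   urllib.request.urlopen(artifact_request("<your-session-id>", "plots/plot.png"))
```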

## Available tools

| Tool | Purpose |
| ---- | ------- |
| `run_code` | Execute Python in an isolated sandbox with optional pip deps. |
| `list_dir` | List files/directories inside your session workspace. |
| `preview_file` | Return up to 8 KB of a text file for quick inspection. |
| `persist_artifact` | Upload an `output/` file to a client-provided presigned URL. |
| `mount_file` | Download a remote file once per session to `mounts/<path>`. |

See the `examples/` directory for end-to-end demos.

## Contributing

Contributions are welcome! Feel free to open issues, suggest features, or submit pull requests to help improve PRIMS.

If you find this project useful, please consider leaving a ⭐ to show your support.
