
Blender MCP Server 🎨🤖

A highly optimized Model Context Protocol (MCP) server that connects AI Assistants (like Claude, Gemini, etc.) directly to Blender.

This allows you to control Blender using natural language to create objects, apply textures, manage scenes, and more, with a focus on token efficiency and speed.

✨ Features

  • 🚀 Optimized for AI: Uses compact JSON and output truncation to save 75%+ tokens compared to raw scripting.

  • 🔌 Direct Integration: Works with any MCP-compliant client (Cursor, Windsurf, etc.).

  • 🛠️ Powerful Tools:

    • create_primitive: Create Cubes, Spheres, Cones, etc.

    • transform_object: Move, Rotate, Scale.

    • apply_texture: Auto-Join objects, Smart UV Project, and apply textures in one go.

    • apply_material_preset: Apply Gold, Glass, Plastic, etc. instantly.

    • scatter_objects: Randomly distribute objects.

    • get_scene_info: See what's in your scene (with filtering).

  • UI Control: Dedicated panel in Blender to Start/Stop the server.


📦 Installation Guide

1. Prerequisites

  • Blender 3.0+ installed.

  • Python 3.10+ installed on your system.

  • Git (optional, for cloning).

2. Setup the Project

Clone this repository or download the files:

git clone https://github.com/yourusername/blender-mcp.git
cd blender-mcp

3. Install Python Dependencies

Option A: Local Python (Simple)

Install the mcp library required by the server:

pip install mcp

Option B: Using Docker (Clean & Isolated) 🐳

If you prefer not to install Python dependencies locally, you can use Docker.

  1. Build the Image:

    docker build -t blender-mcp .
  2. Update MCP Config: Use the docker command instead of python in your IDE config (see Configuration section below).

4. Install the Blender Addon

  1. Open Blender.

  2. Go to Edit > Preferences > Add-ons.

  3. Click Install... at the top right.

  4. Navigate to this folder and select addon.py.

  5. Click Install Add-on.

  6. ✅ Check the box next to Interface: MCP Connector to enable it.


⚙️ Configuration (AI IDE)

You need to tell your AI IDE (like Cursor or Windsurf) about this MCP server.

Option A: Local Python Config

{
  "mcpServers": {
    "blender-mcp": {
      "command": "python",
      "args": ["ABSOLUTE/PATH/TO/blender-mcp/server.py"],
      "env": {
        "BLENDER_HOST": "127.0.0.1",
        "BLENDER_PORT": "9876"
      }
    }
  }
}
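The path in the config must be absolute. If you want to avoid typing it by hand, a small helper like this (hypothetical, for illustration) can print a ready-to-paste config with the path resolved on your machine; the repo location is an assumption — adjust it to wherever you cloned blender-mcp:

```python
import json
from pathlib import Path

# Assumed repo location relative to the current directory; adjust as needed.
repo = Path("blender-mcp").resolve()

config = {
    "mcpServers": {
        "blender-mcp": {
            "command": "python",
            "args": [str(repo / "server.py")],
            "env": {"BLENDER_HOST": "127.0.0.1", "BLENDER_PORT": "9876"},
        }
    }
}
# Prints JSON you can paste into your IDE's MCP config file.
print(json.dumps(config, indent=2))
```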

Option B: Docker Config

If you built the Docker image, use this configuration:

{
  "mcpServers": {
    "blender-mcp": {
      "command": "docker",
      "args": [
        "run", "-i", "--rm",
        "--env", "BLENDER_HOST=host.docker.internal",
        "blender-mcp"
      ]
    }
  }
}

Note: In Option A, replace ABSOLUTE/PATH/TO/blender-mcp with the actual absolute path to this repository on your machine.


🚀 Usage

1. Start the Server in Blender

  1. Open the 3D Viewport in Blender.

  2. Press N to open the Sidebar.

  3. Click the MCP tab.

  4. Click Start Server ▶️.

    • Status should change to "Connected".

2. Connect & Prompt!

Restart your AI IDE to load the new MCP config. Now you can ask things like:

"Create a low poly tree with a brown cylinder trunk and a green cone top."

"Scatter 50 gold cubes in a radius of 10 meters."

"Clear the scene and make a red monkey head."

"Apply this texture 'C:/Textures/Wood.jpg' to the Cube."


🛠️ Troubleshooting

  • Connection Refused?

    • Make sure you clicked Start Server in the Blender MCP panel.

    • Check if the port 9876 is blocked by a firewall.

  • "Object not found" error?

    • Ensure the object name matches exactly (case-sensitive). Use get_scene_info to check names.

  • Textures not showing?

    • Make sure you are in Material Preview or Rendered mode in Blender (Press Z).
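For the "Connection Refused" case above, a quick way to confirm whether the Blender addon is actually listening is to probe the port directly. This sketch uses the default host and port from the configuration above; adjust them if you changed your config:

```python
import socket

def blender_port_open(host: str = "127.0.0.1", port: int = 9876) -> bool:
    """Return True if something is accepting TCP connections on the addon's port."""
    try:
        with socket.create_connection((host, port), timeout=2):
            return True
    except OSError:
        # Covers connection refused, timeouts, and unreachable hosts.
        return False

if __name__ == "__main__":
    print("Blender MCP port open:", blender_port_open())
```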


📜 License

MIT License
