MCP Vertica

by p2k3m
Apache 2.0

mcp-vertica — Local NLP + REST for Vertica (no auth)

Everything runs on your laptop: Vertica CE via Docker, a local REST API, and a terminal NLP→SQL command powered by a local LLM (Ollama). There is no authentication, and services bind to 0.0.0.0 for convenience.

⚠️ Security is intentionally disabled for local demos. Do not expose to the public internet.

Prerequisites

  • Docker Desktop
  • Python 3.12+
  • uv (recommended) or pip
  • Ollama (for local LLM)
    • Mac: brew install ollama → ollama serve & → ollama pull llama3.1:8b
    • Windows: install Ollama app → run “Ollama” → in PowerShell: ollama pull llama3.1:8b
  • (Optional) A Vertica instance; we provide Docker.

1) Start Vertica locally

docker compose up -d
# Wait until healthy (30–60s)
docker ps

See Vertica container lifecycle for log inspection, shutdown, and database verification commands.
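Instead of re-running docker ps by hand during the 30–60 s startup, the readiness wait can be scripted. A minimal Python sketch (the host/port below are the documented defaults; the helper name is ours):

```python
import socket
import time

def wait_for_port(host: str, port: int, timeout: float = 90.0) -> bool:
    """Poll a TCP port until it accepts connections or the timeout expires."""
    deadline = time.monotonic() + timeout
    while time.monotonic() < deadline:
        try:
            with socket.create_connection((host, port), timeout=2):
                return True
        except OSError:
            time.sleep(1)
    return False

# Example: block until Vertica's default port answers before seeding data
# ready = wait_for_port("localhost", 5433)
```

Note this only proves the port accepts TCP connections; the vsql check in the lifecycle section below confirms the database itself is up.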

Defaults:

Host: localhost

Port: 5433

Database: VMart

User: dbadmin

Password: (empty)

Connection limit: 1 (override with VERTICA_CONNECTION_LIMIT)

2) Install & configure mcp-vertica

# Mac/Linux (uv)
uv sync
# Or pip:
# python -m venv .venv && source .venv/bin/activate
# pip install -e .

Set env (Mac/Linux bash or zsh):

export VERTICA_HOST=127.0.0.1
export VERTICA_PORT=5433
export VERTICA_DATABASE=VMart
export VERTICA_USER=dbadmin
export VERTICA_PASSWORD=""
# Optional: override connection limit (default 1)
export VERTICA_CONNECTION_LIMIT=5

Windows (PowerShell):

$env:VERTICA_HOST="127.0.0.1"
$env:VERTICA_PORT="5433"
$env:VERTICA_DATABASE="VMart"
$env:VERTICA_USER="dbadmin"
$env:VERTICA_PASSWORD=""
$env:VERTICA_CONNECTION_LIMIT="5"  # optional override (default 1)
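For scripting against the same settings, the environment can be read back in Python with fallbacks matching the documented defaults (the helper name and dict shape are ours, not part of mcp-vertica's API):

```python
import os

def vertica_config() -> dict:
    """Assemble connection settings from the environment,
    falling back to the defaults listed in step 1."""
    return {
        "host": os.environ.get("VERTICA_HOST", "localhost"),
        "port": int(os.environ.get("VERTICA_PORT", "5433")),
        "database": os.environ.get("VERTICA_DATABASE", "VMart"),
        "user": os.environ.get("VERTICA_USER", "dbadmin"),
        "password": os.environ.get("VERTICA_PASSWORD", ""),
        "connection_limit": int(os.environ.get("VERTICA_CONNECTION_LIMIT", "1")),
    }
```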

Operation permissions

Global defaults for SQL operations can be controlled with environment variables:

export ALLOW_SELECT_OPERATION=true
export ALLOW_INSERT_OPERATION=false
export ALLOW_UPDATE_OPERATION=false
export ALLOW_DELETE_OPERATION=false
export ALLOW_DDL_OPERATION=false

You can override these on a per-schema basis using comma-separated schema:true|false pairs:

export SCHEMA_SELECT_PERMISSIONS="public:true,itsm:false"
# SCHEMA_INSERT_PERMISSIONS, SCHEMA_UPDATE_PERMISSIONS,
# SCHEMA_DELETE_PERMISSIONS and SCHEMA_DDL_PERMISSIONS work the same way

If no schema permissions are configured, the server logs a notice and all operations (SELECT, INSERT, UPDATE, DELETE and DDL) fall back to the global settings.
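The resolution order can be sketched in a few lines of Python. This is an illustration of the documented behavior (the schema:true|false format and global fallback), not mcp-vertica's actual implementation; the function names are ours:

```python
def parse_schema_permissions(spec: str) -> dict:
    """Parse 'public:true,itsm:false' into {'public': True, 'itsm': False}."""
    perms = {}
    for pair in filter(None, (p.strip() for p in spec.split(","))):
        schema, _, flag = pair.partition(":")
        perms[schema.strip()] = flag.strip().lower() == "true"
    return perms

def is_allowed(schema: str, schema_perms: dict, global_default: bool) -> bool:
    """A per-schema setting wins; otherwise fall back to the global flag."""
    return schema_perms.get(schema, global_default)
```

With SCHEMA_SELECT_PERMISSIONS="public:true,itsm:false" and ALLOW_SELECT_OPERATION=false, SELECTs would be permitted on public, denied on itsm, and denied everywhere else.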

3) Seed ITSM/CMDB sample data

python scripts/seed_itsm.py
# Creates schemas itsm/cmdb and loads ~2k incidents + CIs/changes/relations

4) REST API (no auth)

uvx mcp-vertica serve-rest --host 0.0.0.0 --port 8001

Test:

curl http://127.0.0.1:8001/api/health
curl -X POST http://127.0.0.1:8001/api/query \
  -H 'Content-Type: application/json' \
  -d '{"sql":"SELECT COUNT(*) AS n FROM itsm.incident;"}'

NLP endpoint:

curl -X POST http://127.0.0.1:8001/api/nlp \
  -H 'Content-Type: application/json' \
  -d '{"question":"Top 5 incident categories this month", "execute": true}'
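The same endpoints can be called from Python's standard library. A sketch assuming the request shapes shown in the curl examples and a JSON response body (the helper name is ours; error handling omitted):

```python
import json
import urllib.request

def post_json(url: str, payload: dict) -> dict:
    """POST a JSON body and decode the JSON response."""
    req = urllib.request.Request(
        url,
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read().decode("utf-8"))

# Example (requires the REST server from step 4 to be running):
# post_json("http://127.0.0.1:8001/api/query",
#           {"sql": "SELECT COUNT(*) AS n FROM itsm.incident;"})
```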

5) NLP from terminal

Start Ollama in background (if not already):

ollama serve &
ollama pull llama3.1:8b

Examples:

# Ask anything; will generate Vertica SQL and run it
uvx mcp-vertica nlp ask "Top 5 incident categories this month by count"
# Create a table (mutations allowed)
uvx mcp-vertica nlp ask "Create table staging.high_prio_incidents as P1 incidents last 7 days"
# Dry-run (just show SQL)
uvx mcp-vertica nlp ask --dry-run "List incidents joined to CI class and change window overlap"
# Similar incidents
uvx mcp-vertica nlp similar --incident-id INC000123
uvx mcp-vertica nlp similar --text "database timeout in payment service" --top-k 10
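Under the hood, an NLP→SQL step like this sends the question (typically with schema hints) to the local model. A hedged sketch against Ollama's /api/generate endpoint; the prompt text and function names are ours, not mcp-vertica's internals:

```python
import json
import urllib.request

OLLAMA_URL = "http://127.0.0.1:11434/api/generate"  # Ollama's default local endpoint

def build_prompt(question: str, schema_hint: str) -> str:
    """Compose a prompt asking the model for a single Vertica SQL statement."""
    return (
        "You are a Vertica SQL assistant. Given this schema:\n"
        f"{schema_hint}\n"
        f"Write one Vertica SQL statement answering: {question}\n"
        "Return only SQL."
    )

def ask_ollama(question: str, schema_hint: str, model: str = "llama3.1:8b") -> str:
    """Send the prompt to a locally running Ollama and return the raw completion."""
    payload = {"model": model, "prompt": build_prompt(question, schema_hint), "stream": False}
    req = urllib.request.Request(
        OLLAMA_URL,
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]
```

Running ask_ollama requires ollama serve and a pulled model, per the prerequisites.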

6) SSE MCP server (unchanged)

uvx mcp-vertica --port 8000 # runs SSE (0.0.0.0)

Vertica container lifecycle

Start the Vertica container:

docker compose up -d

Follow logs until the database is ready:

docker compose logs -f vertica

Verify connectivity:

docker exec vertica-ce vsql -U dbadmin -d VMart -c "SELECT 1"

Shut down the container:

docker compose down

Troubleshooting

If the MCP client can’t connect: run uv cache clean and retry.

If Vertica isn’t ready: check docker logs vertica-ce and retry once the container is healthy. To list databases inside the container:

admintools -t list_all_dbs
# Name  | Owner   | State
# VMart | dbadmin | Running

Inspect a specific database:

admintools -t list_db -d VMart
# Database: VMart
# State: Running
# v_vmart_node0001 (UP)

If Ollama fails: ensure ollama serve is running and that you have pulled a model (e.g. ollama pull llama3.1:8b).

License

MIT


