
Enterprise Code Search MCP Server

docker-compose.yml
version: '3.8'

services:
  # ChromaDB Vector Database
  chromadb:
    image: chromadb/chroma:latest
    container_name: enterprise-chromadb
    ports:
      - "8000:8000"
    volumes:
      - chromadb_data:/chroma/chroma
    environment:
      - CHROMA_SERVER_HOST=0.0.0.0
      - CHROMA_SERVER_HTTP_PORT=8000
      - PERSIST_DIRECTORY=/chroma/chroma
    networks:
      - mcp-network
    restart: unless-stopped

  # Ollama for local embeddings
  ollama:
    image: ollama/ollama:latest
    container_name: enterprise-ollama
    ports:
      - "11434:11434"
    volumes:
      - ollama_data:/root/.ollama
      # Optional: mount GPU if available
      # - /dev/nvidia0:/dev/nvidia0
    environment:
      - OLLAMA_HOST=0.0.0.0
    networks:
      - mcp-network
    restart: unless-stopped
    # Uncomment if you have an NVIDIA GPU
    # runtime: nvidia
    # deploy:
    #   resources:
    #     reservations:
    #       devices:
    #         - driver: nvidia
    #           count: 1
    #           capabilities: [gpu]

  # Optional: Redis for caching
  redis:
    image: redis:7-alpine
    container_name: enterprise-redis
    ports:
      - "6379:6379"
    volumes:
      - redis_data:/data
    networks:
      - mcp-network
    restart: unless-stopped

  # Optional: PostgreSQL for metadata
  postgres:
    image: postgres:15-alpine
    container_name: enterprise-postgres
    ports:
      - "5432:5432"
    environment:
      POSTGRES_DB: enterprise_mcp
      POSTGRES_USER: mcp_user
      POSTGRES_PASSWORD: ${POSTGRES_PASSWORD:-secure_password_123}
    volumes:
      - postgres_data:/var/lib/postgresql/data
    networks:
      - mcp-network
    restart: unless-stopped

  # Setup service to pull Ollama models
  ollama-setup:
    image: ollama/ollama:latest
    container_name: ollama-setup
    depends_on:
      - ollama
    networks:
      - mcp-network
    environment:
      - OLLAMA_HOST=ollama:11434
    volumes:
      - ./scripts/setup-ollama.sh:/setup.sh
    command: /bin/bash /setup.sh
    restart: "no"

networks:
  mcp-network:
    driver: bridge

volumes:
  chromadb_data:
    driver: local
  ollama_data:
    driver: local
  redis_data:
    driver: local
  postgres_data:
    driver: local
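The ollama-setup service mounts ./scripts/setup-ollama.sh, which is not included on this page. A minimal sketch of what such a script might look like is below; the model name (nomic-embed-text), the retry loop, and the timeout are illustrative assumptions, not taken from the actual repository.

```shell
#!/bin/bash
# Illustrative sketch of scripts/setup-ollama.sh (assumed contents).
# Waits for the Ollama HTTP API to come up, then pulls an embedding model.
set -euo pipefail

OLLAMA_HOST="${OLLAMA_HOST:-ollama:11434}"
MODEL="${EMBEDDING_MODEL:-nomic-embed-text}"  # assumed model name

# Wait until the Ollama API answers (up to ~60 seconds)
for _ in $(seq 1 30); do
  if curl -sf "http://${OLLAMA_HOST}/api/tags" > /dev/null; then
    break
  fi
  echo "Waiting for Ollama at ${OLLAMA_HOST}..."
  sleep 2
done

# The ollama CLI honors OLLAMA_HOST, so this pulls into the ollama service
ollama pull "${MODEL}"
echo "Model ${MODEL} ready."
```

Because the service runs with `restart: "no"`, the script executes once at stack startup and the container then exits.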

MCP directory API

We provide all the information about MCP servers via our MCP API.

curl -X GET 'https://glama.ai/api/mcp/v1/servers/damian-pramparo/semantic-context-mcp'

If you have feedback or need assistance with the MCP directory API, please join our Discord server