ollama-entrypoint.sh (285 B)
#!/bin/sh
ollama serve &

# Give the API a moment to start
sleep 5

# Check if the model is already downloaded
if ! ollama list | grep -q "nomic-embed-text"; then
  ollama pull nomic-embed-text
fi

# Do not run the model interactively; the server will lazily load on /api/embed
wait
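As the final comment notes, the model is not loaded at startup; Ollama loads it on demand the first time the embeddings endpoint is hit. A minimal smoke test once the container is up might look like this (assuming Ollama is listening on its default port 11434):

# Request a single embedding; the first call triggers the lazy model load
curl http://localhost:11434/api/embed \
  -d '{"model": "nomic-embed-text", "input": "hello world"}'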

MCP directory API

We provide all the information about MCP servers via our MCP API.

curl -X GET 'https://glama.ai/api/mcp/v1/servers/ZanzyTHEbar/mcp-memory-libsql-go'
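Assuming the endpoint returns JSON (the response fields are not documented here), the output can be piped through jq for inspection:

# Fetch the directory entry for this server and pretty-print the response
curl -s 'https://glama.ai/api/mcp/v1/servers/ZanzyTHEbar/mcp-memory-libsql-go' | jq .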

If you have feedback or need assistance with the MCP directory API, please join our Discord server.