Conduit

by mcpnow-io
ci.yml (1.02 kB)
```yaml
name: CI
on: [push, workflow_dispatch]
permissions:
  contents: read
jobs:
  ci:
    strategy:
      matrix:
        os: [ubuntu-24.04, ubuntu-22.04]
    runs-on: ${{ matrix.os }}
    steps:
      - name: checkout
        uses: actions/checkout@v4
      - name: Set up Docker
        uses: docker/setup-docker-action@v4
      - name: Build and run Image
        run: |
          cd tests/
          docker build -t phorge_debug .
          docker run -d -p 8080:80 --rm --name phorge_debug phorge_debug
      - name: Get API Token
        run: |
          PHABRICATOR_TOKEN=$(docker exec phorge_debug /usr/local/bin/get-api-token.sh)
          echo "PHABRICATOR_TOKEN=$PHABRICATOR_TOKEN" >> "$GITHUB_ENV"
          echo "PHABRICATOR_URL=http://127.0.0.1:8080/api/" >> "$GITHUB_ENV"
      - name: unittest
        run: |
          sudo apt install python3 python3-venv -y
          python3 -m venv ci
          source ci/bin/activate
          pip install .[dev]
          pre-commit run -a
          coverage run -m pytest -s
```
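The workflow builds a Phorge container from `tests/`, extracts a Conduit API token from it, exports the token and API URL for the test run, and then executes pre-commit and pytest with coverage. A minimal sketch of reproducing the same flow locally, assuming Docker and Python 3 with venv are installed and you start from the repository root (it reuses only commands already present in the workflow; nothing beyond that is implied about the project):

```bash
#!/usr/bin/env bash
# Local sketch of the CI steps above; not a script shipped with the project.
set -euo pipefail

# Build and start the Phorge test instance, same image name and port as the workflow.
cd tests/
docker build -t phorge_debug .
docker run -d -p 8080:80 --rm --name phorge_debug phorge_debug
cd ..

# Fetch the Conduit API token provided by the test image and export the
# environment variables the test suite expects.
export PHABRICATOR_TOKEN="$(docker exec phorge_debug /usr/local/bin/get-api-token.sh)"
export PHABRICATOR_URL="http://127.0.0.1:8080/api/"

# Install dev dependencies in a virtualenv, then run lint hooks and tests
# with coverage, mirroring the "unittest" step.
python3 -m venv ci
source ci/bin/activate
pip install .[dev]
pre-commit run -a
coverage run -m pytest -s
```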
