
IBM i MCP Server

Official
by IBM
.env.example • 3.37 kB
# ============================================================
# IBM i MCP Client - Environment Configuration
# ============================================================
# Copy this file to .env and configure based on your LLM provider:
#   cp .env.example .env

# ------------------------------------------------------------
# Option 1: OpenAI (Recommended for Getting Started)
# ------------------------------------------------------------
# Get your API key from: https://platform.openai.com/api-keys
# Cost: Pay per use (see OpenAI pricing)
# Models: gpt-4o, gpt-4-turbo, gpt-3.5-turbo

# OPENAI_API_KEY=sk-proj-xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx

# ------------------------------------------------------------
# Option 2: IBM WatsonX (Enterprise)
# ------------------------------------------------------------
# Setup:
#   1. Create IBM Cloud account: https://cloud.ibm.com/
#   2. Create WatsonX.ai service instance
#   3. Get API key from: https://cloud.ibm.com/iam/apikeys
#   4. Get project ID from your WatsonX project settings
#
# Models: meta-llama/llama-3-3-70b-instruct, ibm/granite-13b-chat-v2

# IBM_WATSONX_API_KEY=your_watsonx_api_key
# IBM_WATSONX_PROJECT_ID=your_project_id
# IBM_WATSONX_BASE_URL=https://us-south.ml.cloud.ibm.com

# ------------------------------------------------------------
# Option 3: Ollama (Local/Free)
# ------------------------------------------------------------
# Setup:
#   1. Install Ollama: https://ollama.com/download
#   2. Pull a model: ollama pull qwen2.5:latest
#   3. Start Ollama: ollama serve
#
# No API key needed - runs completely locally
# Models: qwen2.5:latest, llama3.2:latest, mistral:latest

# ------------------------------------------------------------
# Option 4: Anthropic Claude (Alternative)
# ------------------------------------------------------------
# Get your API key from: https://console.anthropic.com/
# Cost: Pay per use (see Anthropic pricing)
# Models: claude-3-5-sonnet-20241022, claude-3-opus-20240229

# ANTHROPIC_API_KEY=sk-ant-xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx

# ============================================================
# Development Settings (Optional)
# ============================================================

# Enable debug logging
# DEBUG=1

# Python log level
# LOG_LEVEL=INFO

# ============================================================
# Usage Examples
# ============================================================
#
# After configuring your .env file, run scripts like:
#
# Basic MCP client (no LLM required):
#   uv run mcp_client.py
#   uv run list_tool_annotations.py
#
# AI Agent with OpenAI:
#   uv run agent.py --model-id "openai:gpt-4o" -p "Show system CPU usage"
#
# AI Agent with WatsonX:
#   uv run agent.py --model-id "watsonx:meta-llama/llama-3-3-70b-instruct" -p "List active jobs"
#
# AI Agent with Ollama (local):
#   uv run agent.py --model-id "ollama:qwen2.5:latest" -p "Check system status"
#
# Authenticated agent:
#   uv run test_auth_agent.py -p "Analyze system performance"

# ============================================================
# Security Notes
# ============================================================
# - NEVER commit .env files to version control
# - Add .env to .gitignore
# - Use environment-specific tokens (dev/staging/prod)
# - Rotate API keys and tokens regularly
# - Use short-lived tokens for production environments
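The commented-out variables above are read from the process environment at runtime. As a rough illustration of how a client could choose among the four options, here is a minimal Python sketch; the `PROVIDERS` table and `first_configured_provider` helper are hypothetical, not part of the IBM i MCP client, though the variable names match the file above:

```python
# Hypothetical provider-selection sketch. Variable names mirror
# .env.example; the selection logic itself is an assumption.
PROVIDERS = {
    "openai": ["OPENAI_API_KEY"],
    "watsonx": ["IBM_WATSONX_API_KEY", "IBM_WATSONX_PROJECT_ID"],
    "anthropic": ["ANTHROPIC_API_KEY"],
    "ollama": [],  # no key needed - runs completely locally
}


def first_configured_provider(env: dict) -> str:
    """Return the first provider whose required variables are all set."""
    for name, required in PROVIDERS.items():
        if all(env.get(var) for var in required):
            return name
    return "ollama"  # local fallback requires no credentials


if __name__ == "__main__":
    import os
    print(first_configured_provider(dict(os.environ)))
```

With only `OPENAI_API_KEY` set this returns `"openai"`; with nothing set it falls back to `"ollama"`, matching the "no API key needed" note above.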

MCP directory API

We provide all the information about MCP servers via our MCP API.

curl -X GET 'https://glama.ai/api/mcp/v1/servers/IBM/ibmi-mcp-server'
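The same endpoint can be called from Python with the standard library. A minimal sketch, assuming only the URL pattern shown in the curl command above (the JSON response shape is not documented here, so the fetch helper makes no assumptions about its fields):

```python
import json
import urllib.request

API_BASE = "https://glama.ai/api/mcp/v1/servers"


def server_url(owner: str, repo: str) -> str:
    """Build the MCP directory URL for a given server."""
    return f"{API_BASE}/{owner}/{repo}"


def fetch_server(owner: str, repo: str) -> dict:
    """GET the server record and decode it as JSON (network call)."""
    with urllib.request.urlopen(server_url(owner, repo)) as resp:
        return json.load(resp)


if __name__ == "__main__":
    print(server_url("IBM", "ibmi-mcp-server"))
    # -> https://glama.ai/api/mcp/v1/servers/IBM/ibmi-mcp-server
```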

If you have feedback or need assistance with the MCP directory API, please join our Discord server.