
LLM Tool-Calling Assistant

by o6-webwork

This project connects a local LLM (e.g. Qwen) to tools such as a calculator or a knowledge base via the Model Context Protocol (MCP). The assistant automatically detects and calls these tools to answer user queries.


📦 Features

  • 🔧 Tool execution through MCP server

  • 🧠 Local LLM integration via HTTP or OpenAI SDK

  • 📚 Knowledge base support (data.json)

  • ⚡ Supports stdio and sse transports



🗂 Project Files

| File | Description |
| --- | --- |
| `server.py` | Registers tools and starts the MCP server |
| `client-http.py` | Uses `aiohttp` to communicate with the local LLM |
| `client-openai.py` | Uses the OpenAI-compatible SDK for LLM and tool-call logic |
| `client-stdio.py` | MCP client using the stdio transport |
| `client-sse.py` | MCP client using the SSE transport |
| `data.json` | Q&A knowledge base |


📥 Installation

Requirements

Python 3.10+ (required by the mcp SDK)

Install dependencies:

pip install -r requirements.txt

requirements.txt

```
aiohttp==3.11.18
nest_asyncio==1.6.0
python-dotenv==1.1.0
openai==1.77.0
mcp==1.6.0
```

🚀 Getting Started

1. Run the MCP server

python server.py

This launches your tool server with functions like add, multiply, and get_knowledge_base.
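The repository's `server.py` isn't reproduced on this page, but a minimal sketch using the `FastMCP` API from the `mcp` package (assuming the tool names above) could look like this:

```python
from mcp.server.fastmcp import FastMCP

# Hypothetical minimal server; the real server.py may differ
mcp = FastMCP("tool-server")

@mcp.tool()
def add(a: float, b: float) -> float:
    """Add two numbers."""
    return a + b

@mcp.tool()
def multiply(a: float, b: float) -> float:
    """Multiply two numbers."""
    return a * b

if __name__ == "__main__":
    # Switch to transport="sse" for the SSE client (see Option D below)
    mcp.run(transport="stdio")
```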

2. Start a client

Option A: HTTP client (local LLM via raw API)

python client-http.py
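`client-http.py` isn't shown here; as a rough sketch, calling an OpenAI-compatible endpoint with `aiohttp` might look like this (the URL, token, and model name are placeholders, matching the Configuration section below):

```python
import asyncio
import aiohttp

LOCAL_LLM_URL = "http://localhost:8000/v1/chat/completions"  # placeholder
TOKEN = "your-api-token"
LOCAL_LLM_MODEL = "your-model"

async def ask(prompt: str) -> str:
    # POST a chat completion request to the local LLM
    async with aiohttp.ClientSession() as http:
        async with http.post(
            LOCAL_LLM_URL,
            headers={"Authorization": f"Bearer {TOKEN}"},
            json={"model": LOCAL_LLM_MODEL,
                  "messages": [{"role": "user", "content": prompt}]},
        ) as resp:
            data = await resp.json()
            return data["choices"][0]["message"]["content"]

print(asyncio.run(ask("What is 8 times 3?")))
```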

Option B: OpenAI SDK client

python client-openai.py

Option C: stdio transport

python client-stdio.py
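For reference, a bare-bones stdio client with the `mcp` SDK looks roughly like this (the tool name and arguments are illustrative):

```python
import asyncio
from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client

# Spawn server.py as a subprocess and talk to it over stdin/stdout
server_params = StdioServerParameters(command="python", args=["server.py"])

async def main():
    async with stdio_client(server_params) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()
            tools = await session.list_tools()
            print([t.name for t in tools.tools])
            result = await session.call_tool("multiply", {"a": 8, "b": 3})
            print(result.content)

asyncio.run(main())
```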

Option D: SSE transport

Make sure server.py sets:

transport = "sse"

Then run:

python client-sse.py
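A minimal SSE client sketch (this assumes FastMCP's default endpoint of `http://localhost:8000/sse`; adjust the host and port if `server.py` binds elsewhere):

```python
import asyncio
from mcp import ClientSession
from mcp.client.sse import sse_client

async def main():
    # Default FastMCP SSE endpoint; adjust to match server.py
    async with sse_client("http://localhost:8000/sse") as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()
            result = await session.call_tool("add", {"a": 8, "b": 3})
            print(result.content)

asyncio.run(main())
```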

💬 Example Prompts

Math Tool Call

What is 8 times 3?

Response:

Eight times three is 24.

Knowledge Base Question

What are the healthcare benefits available to employees in Singapore?

Response will include the relevant answer from data.json.


📁 Example: data.json

[ { "question": "What is Singapore's public holiday schedule?", "answer": "Singapore observes several public holidays..." }, { "question": "How do I apply for permanent residency in Singapore?", "answer": "Submit an online application via the ICA website..." } ]

🔧 Configuration

Inside `client-http.py` or `client-openai.py`, update the following:

```python
LOCAL_LLM_URL = "..."
TOKEN = "your-api-token"
LOCAL_LLM_MODEL = "your-model"
```

Make sure your LLM is serving OpenAI-compatible API endpoints.
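With those values set, the OpenAI SDK path boils down to something like this (a sketch; the real `client-openai.py` also wires tool definitions into the request):

```python
from openai import OpenAI

LOCAL_LLM_URL = "http://localhost:8000/v1"  # placeholder base URL
TOKEN = "your-api-token"
LOCAL_LLM_MODEL = "your-model"

# Point the OpenAI client at the local OpenAI-compatible server
client = OpenAI(base_url=LOCAL_LLM_URL, api_key=TOKEN)
response = client.chat.completions.create(
    model=LOCAL_LLM_MODEL,
    messages=[{"role": "user", "content": "What is 8 times 3?"}],
)
print(response.choices[0].message.content)
```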


🧹 Cleanup

Clients handle tool calls and responses automatically. You can stop the server or client using Ctrl+C.


🪪 License

MIT License. See LICENSE file.
