
MCP Conversation Server

A Model Context Protocol (MCP) server implementation for managing conversations with language models available through OpenRouter. This server provides a standardized interface for applications to interact with various language models through a unified conversation management system.

Features

  • MCP Protocol Support

    • Full MCP protocol compliance

    • Resource management and discovery

    • Tool-based interaction model

    • Streaming response support

    • Error handling and recovery

  • OpenRouter Integration

    • Support for all OpenRouter models

    • Real-time streaming responses

    • Automatic token counting

    • Model context window management

    • Available models include:

      • Claude 3 Opus

      • Claude 3 Sonnet

      • Llama 2 70B

      • And many more from OpenRouter's catalog

  • Conversation Management

    • Create and manage multiple conversations

    • Support for system messages

    • Message history tracking

    • Token usage monitoring

    • Conversation filtering and search

  • Streaming Support

    • Real-time message streaming

    • Chunked response handling

    • Token counting

  • File System Persistence

    • Conversation state persistence

    • Configurable storage location

    • Automatic state management
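The streaming and token-counting behavior listed above can be sketched roughly as follows. This is an illustrative sketch only, not the server's actual implementation; `collectStream` and `estimateTokens` are hypothetical helper names, and the ~4-characters-per-token ratio is just a common rule of thumb for English text.

```typescript
// Hypothetical helpers illustrating chunked streaming and rough token counting.

// Assemble a streamed response from an async iterable of text chunks.
async function collectStream(chunks: AsyncIterable<string>): Promise<string> {
  let full = "";
  for await (const chunk of chunks) {
    full += chunk; // a real client might also forward each chunk to the UI
  }
  return full;
}

// Very rough token estimate (~4 characters per token for English text).
function estimateTokens(text: string): number {
  return Math.ceil(text.length / 4);
}

// Example: simulate a streamed reply.
async function* fakeStream(): AsyncGenerator<string> {
  yield "Hello, ";
  yield "world!";
}

async function main() {
  const reply = await collectStream(fakeStream());
  console.log(reply, estimateTokens(reply)); // Hello, world! 4
}
main();
```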


Installation

npm install mcp-conversation-server

Configuration

All configuration for the MCP Conversation Server is now provided via YAML. Please update the config/models.yaml file with your settings. For example:

```yaml
# MCP Server Configuration
openRouter:
  apiKey: "YOUR_OPENROUTER_API_KEY"  # Replace with your actual OpenRouter API key.

persistence:
  path: "./conversations"            # Directory for storing conversation data.

models:
  # Define your models here
  'provider/model-name':
    id: 'provider/model-name'
    contextWindow: 123456
    streaming: true
    temperature: 0.7
    description: 'Model description'

# Default model to use if none specified
defaultModel: 'provider/model-name'
```
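To make the expected structure of this YAML concrete, here is a sketch of a matching TypeScript shape with a small sanity check. The interfaces and `validateConfig` are hypothetical names for illustration, not part of the package's API.

```typescript
// Hypothetical TypeScript shape mirroring config/models.yaml.
interface ModelConfig {
  id: string;
  contextWindow: number;
  streaming: boolean;
  temperature: number;
  description: string;
}

interface ServerConfig {
  openRouter: { apiKey: string };
  persistence: { path: string };
  models: Record<string, ModelConfig>;
  defaultModel: string;
}

// Check the fields most likely to be misconfigured.
function validateConfig(config: ServerConfig): string[] {
  const errors: string[] = [];
  if (!config.openRouter.apiKey) {
    errors.push("openRouter.apiKey is empty");
  }
  if (!(config.defaultModel in config.models)) {
    errors.push(`defaultModel '${config.defaultModel}' is not defined under models`);
  }
  return errors;
}
```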

Server Configuration

The MCP Conversation Server now loads all its configuration from the YAML file. In your application, you can load the configuration as follows:

```typescript
// Loads openRouter, persistence, models, and defaultModel settings
// from 'config/models.yaml'.
const config = await loadModelsConfig();
```

Note: Environment variables are no longer required as all configuration is provided via the YAML file.

Usage

Basic Server Setup

```typescript
import { ConversationServer } from 'mcp-conversation-server';

const server = new ConversationServer(config);
server.run().catch(console.error);
```

Available Tools

The server exposes several MCP tools:

  1. create-conversation

    ```typescript
    {
      provider: 'openrouter';  // Provider is always 'openrouter'
      model: string;           // OpenRouter model ID (e.g., 'anthropic/claude-3-opus-20240229')
      title?: string;          // Optional conversation title
    }
    ```
  2. send-message

    ```typescript
    {
      conversationId: string;  // Conversation ID
      content: string;         // Message content
      stream?: boolean;        // Enable streaming responses
    }
    ```
  3. list-conversations

    ```typescript
    {
      filter?: {
        model?: string;      // Filter by model
        startDate?: string;  // Filter by start date
        endDate?: string;    // Filter by end date
      }
    }
    ```
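The list-conversations filter semantics can be sketched as plain predicate logic over conversation summaries. `ConversationSummary` and `filterConversations` are hypothetical names for illustration; they assume dates are ISO 8601 strings, which compare correctly as plain strings.

```typescript
// Hypothetical types illustrating the list-conversations filter.
interface ConversationSummary {
  id: string;
  model: string;
  createdAt: string; // ISO 8601 date
}

interface ConversationFilter {
  model?: string;
  startDate?: string;
  endDate?: string;
}

function filterConversations(
  conversations: ConversationSummary[],
  filter: ConversationFilter = {}
): ConversationSummary[] {
  return conversations.filter((c) => {
    if (filter.model && c.model !== filter.model) return false;
    // ISO 8601 strings of the same shape compare correctly lexicographically.
    if (filter.startDate && c.createdAt < filter.startDate) return false;
    if (filter.endDate && c.createdAt > filter.endDate) return false;
    return true;
  });
}
```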

Resources

The server provides access to several resources:

  1. conversation://{id}

    • Access specific conversation details

    • View message history

    • Check conversation metadata

  2. conversation://list

    • List all active conversations

    • Filter conversations by criteria

    • Sort by recent activity
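A resource handler for these URIs would need to distinguish the `list` resource from a specific conversation ID. The following sketch shows one way to route them; `parseConversationUri` is a hypothetical helper, not the server's actual API.

```typescript
// Hypothetical router for conversation:// resource URIs.
type ConversationResource =
  | { kind: "list" }
  | { kind: "conversation"; id: string };

function parseConversationUri(uri: string): ConversationResource | null {
  const match = /^conversation:\/\/(.+)$/.exec(uri);
  if (!match) return null; // not a conversation:// URI
  const rest = match[1];
  if (rest === "list") return { kind: "list" };
  return { kind: "conversation", id: rest };
}
```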

Development

Building

npm run build

Running Tests

npm test

Debugging

The server provides several debugging features:

  1. Error Logging

    • All errors are logged with stack traces

    • Token usage tracking

    • Rate limit monitoring

  2. MCP Inspector

    npm run inspector

    Use the MCP Inspector to:

    • Test tool execution

    • View resource contents

    • Monitor message flow

    • Validate protocol compliance

  3. Provider Validation

    await server.providerManager.validateProviders();

    Validates:

    • API key validity

    • Model availability

    • Rate limit status

Troubleshooting

Common issues and solutions:

  1. OpenRouter Connection Issues

    • Verify your API key is valid

    • Check rate limits on OpenRouter's dashboard

    • Ensure the model ID is correct

    • Monitor credit usage

  2. Message Streaming Errors

    • Verify model streaming support

    • Check connection stability

    • Monitor token limits

    • Handle timeout settings

  3. File System Errors

    • Check directory permissions

    • Verify path configuration

    • Monitor disk space

    • Handle concurrent access
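One common mitigation for concurrent-access and half-written-file problems is to write state to a temporary file and then rename it into place. The sketch below illustrates that pattern; `saveConversationAtomically` is a hypothetical helper, not the server's actual persistence code.

```typescript
// Hypothetical atomic-write helper for conversation state.
import { promises as fs } from "node:fs";
import * as path from "node:path";
import * as os from "node:os";

async function saveConversationAtomically(
  dir: string,
  id: string,
  state: unknown
): Promise<string> {
  await fs.mkdir(dir, { recursive: true });
  const finalPath = path.join(dir, `${id}.json`);
  const tmpPath = `${finalPath}.${process.pid}.tmp`;
  // Write the full state to a temp file first...
  await fs.writeFile(tmpPath, JSON.stringify(state, null, 2), "utf8");
  // ...then rename it into place. On POSIX filesystems rename is atomic,
  // so readers never observe a partially written file.
  await fs.rename(tmpPath, finalPath);
  return finalPath;
}

// Example usage against a scratch directory.
async function demo() {
  const dir = await fs.mkdtemp(path.join(os.tmpdir(), "conversations-"));
  console.log(await saveConversationAtomically(dir, "example", { messages: [] }));
}
demo();
```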

Contributing

  1. Fork the repository

  2. Create a feature branch

  3. Commit your changes

  4. Push to the branch

  5. Create a Pull Request

License

ISC License
