
MCP Elicitations Demo Server

by soriat

sampleLLM

Generate text from a language model using MCP's sampling feature on the Elicitations Demo Server: provide a prompt and, optionally, a token limit.

Instructions

Samples from an LLM using MCP's sampling feature

Input Schema

Name       Required  Description                            Default
maxTokens  No        Maximum number of tokens to generate   100
prompt     Yes       The prompt to send to the LLM          —

Input Schema (JSON Schema)

{
  "$schema": "http://json-schema.org/draft-07/schema#",
  "type": "object",
  "additionalProperties": false,
  "required": ["prompt"],
  "properties": {
    "maxTokens": {
      "type": "number",
      "default": 100,
      "description": "Maximum number of tokens to generate"
    },
    "prompt": {
      "type": "string",
      "description": "The prompt to send to the LLM"
    }
  }
}
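As a sketch of how a client might use this schema, the snippet below checks the required `prompt` field, rejects unknown keys (the schema sets `additionalProperties: false`), applies the `maxTokens` default of 100, and assembles an MCP `tools/call` request for `sampleLLM`. The `prepare_arguments` helper is hypothetical, not part of any SDK; a full JSON Schema validator would also enforce types.

```python
# Simplified argument preparation for the sampleLLM tool (hypothetical
# helper; not a complete JSON Schema validator).
SCHEMA = {
    "type": "object",
    "additionalProperties": False,
    "required": ["prompt"],
    "properties": {
        "maxTokens": {"type": "number", "default": 100},
        "prompt": {"type": "string"},
    },
}

def prepare_arguments(args: dict) -> dict:
    """Reject unknown keys, check required keys, and fill schema defaults."""
    props = SCHEMA["properties"]
    unknown = set(args) - set(props)
    if unknown:
        raise ValueError(f"unexpected arguments: {sorted(unknown)}")
    missing = [k for k in SCHEMA["required"] if k not in args]
    if missing:
        raise ValueError(f"missing required arguments: {missing}")
    out = {k: v["default"] for k, v in props.items() if "default" in v}
    out.update(args)
    return out

# A tools/call request body as defined by the MCP specification.
request = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "tools/call",
    "params": {
        "name": "sampleLLM",
        "arguments": prepare_arguments({"prompt": "Write a haiku about MCP."}),
    },
}
```

Because `maxTokens` is omitted above, `prepare_arguments` fills in the default of 100 before the request is sent.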

MCP directory API

We provide all of the information about MCP servers via our MCP directory API.

curl -X GET 'https://glama.ai/api/mcp/v1/servers/soriat/soria-mcp'
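The same lookup can be done from Python. This sketch only assumes the endpoint path shown in the curl example above; the shape of the JSON response is not documented here, so the helper simply returns the parsed body.

```python
# Sketch: fetch a server's entry from the Glama MCP directory API
# (endpoint path taken from the curl example; response fields unverified).
import json
import urllib.request

API_BASE = "https://glama.ai/api/mcp/v1"

def server_url(author: str, slug: str) -> str:
    """Build the directory API URL for a given server."""
    return f"{API_BASE}/servers/{author}/{slug}"

def fetch_server(author: str, slug: str) -> dict:
    """GET the server's metadata and return the parsed JSON."""
    with urllib.request.urlopen(server_url(author, slug)) as resp:
        return json.load(resp)
```

For example, `fetch_server("soriat", "soria-mcp")` requests the same URL as the curl command above.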

If you have feedback or need assistance with the MCP directory API, please join our Discord server.