
Gemini MCP

by emmron

mcp__gemini__ai_chat

Send a message to an AI model on the Gemini MCP server. You can optionally supply additional context and select which model handles the request.

Instructions

AI conversation with model selection

Input Schema

Name      Required  Description          Default
context   No        Additional context
message   Yes       Message for AI
model     No        Model type           main

Input Schema (JSON Schema)

{ "$schema": "https://json-schema.org/draft/2020-12/schema", "properties": { "context": { "description": "Additional context", "type": "string" }, "message": { "description": "Message for AI", "type": "string" }, "model": { "default": "main", "description": "Model type", "type": "string" } }, "required": [ "message" ], "type": "object" }

MCP directory API

We provide all the information about MCP servers via our MCP API.

curl -X GET 'https://glama.ai/api/mcp/v1/servers/emmron/gemini-mcp'
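
Assuming the endpoint returns JSON (not confirmed on this page), the response can be pretty-printed with jq:

curl -s 'https://glama.ai/api/mcp/v1/servers/emmron/gemini-mcp' | jq .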

If you have feedback or need assistance with the MCP directory API, please join our Discord server.