MCP Server Basic Example

client-http.py
import asyncio

import nest_asyncio
from mcp import ClientSession
from mcp.client.streamable_http import streamablehttp_client

"""
Make sure:
1. The server is running before running this script.
2. The server is configured to use Streamable HTTP transport.
3. The server is listening on port 8000.

To run the server: uv run server.py / python server.py
To run the client: uv run client-http.py / python client-http.py
"""

nest_asyncio.apply()  # Needed when running inside an interactive Python session


async def main():
    # Connect to the server using Streamable HTTP
    async with streamablehttp_client("http://localhost:8000/mcp") as streams:
        # streams is (read_stream, write_stream, get_session_id)
        async with ClientSession(streams[0], streams[1]) as session:
            # Initialize the connection
            await session.initialize()

            # List available tools
            tools_result = await session.list_tools()
            print("Available tools:")
            for tool in tools_result.tools:
                print(f"  - {tool.name}: {tool.description}")
            print("*****-----*****")
            print()

            # Call our weather tool
            result = await session.call_tool("get_alerts", arguments={"state": "CA"})
            print(f"The weather alerts are = \n{result.content[0].text}")


if __name__ == "__main__":
    print("Running the client-http.py file")
    asyncio.run(main())
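The docstring above assumes a companion server.py that exposes a get_alerts tool over Streamable HTTP on port 8000. That file is not shown here; the following is a minimal sketch of what such a server could look like, assuming the FastMCP helper from the MCP Python SDK and a placeholder alert message rather than a real weather lookup.

# server.py -- minimal sketch, not the repository's actual server.
# Assumes FastMCP from the MCP Python SDK; a real implementation would
# presumably query a weather alerts API instead of returning a stub.
from mcp.server.fastmcp import FastMCP

mcp = FastMCP("Weather")


@mcp.tool()
def get_alerts(state: str) -> str:
    """Return weather alerts for a two-letter US state code (stub)."""
    # Placeholder response; replace with a call to a real alerts API.
    return f"No active alerts found for {state} (stub response)."


if __name__ == "__main__":
    # Streamable HTTP transport; by default FastMCP serves at
    # http://localhost:8000/mcp, matching the URL used by client-http.py.
    mcp.run(transport="streamable-http")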
