
๐ŸŒฆ๏ธ MCP Weather Server

A simple and modular MCP (Model Context Protocol) server that exposes weather-related tools, making it easy to integrate with AI agents, LLMs, or any tool-using client.

This project demonstrates how to create and serve tools such as:

  • get_coordinates(city)

  • get_forecast(latitude, longitude)

Designed to be lightweight, clean, and easy to extend.


🧠 What Is MCP?

MCP (Model Context Protocol) is a protocol for exposing tools (Python functions) in a machine-readable format so they can be:

  • Automatically discovered

  • Dynamically called by AI agents

  • Interoperable across systems

It's built for tool-using LLMs, agents, and next-gen integrations.
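To make "machine-readable" concrete, here is a sketch of the kind of descriptor a client sees when it lists this server's tools. The field names follow the shape of an MCP `tools/list` response; the description text is illustrative:

```python
# Illustrative sketch of a tool descriptor as an MCP client might
# receive it. Field names follow the MCP "tools/list" response shape;
# the description string is made up for this example.
get_forecast_descriptor = {
    "name": "get_forecast",
    "description": "Return a formatted weather forecast for coordinates.",
    "inputSchema": {
        "type": "object",
        "properties": {
            "latitude": {"type": "number"},
            "longitude": {"type": "number"},
        },
        "required": ["latitude", "longitude"],
    },
}

# An agent inspects the schema to learn how to call the tool.
print(sorted(get_forecast_descriptor["inputSchema"]["required"]))
# → ['latitude', 'longitude']
```

Because the schema travels with the tool, a client never needs hardcoded knowledge of the server to call it correctly.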



๐Ÿ“ Project Structure

```
mcp-server/
├── main.py             # Starts the FastMCP server
├── tools/
│   └── get_forcast.py  # MCP tools: get_coordinates and get_forecast
├── pyproject.toml      # Python dependencies
└── README.md           # You're here!
```

🚀 Getting Started

1. Clone the Repo

```bash
git clone https://github.com/jeannassereldine/mcp-server.git
cd mcp-server
```

2. Run the Server

```bash
uv run weather.py
```

This starts the MCP server over stdio. You can connect any MCP client that supports the protocol.


🔧 Tools Overview

`get_coordinates(city: str) -> Tuple[float, float]`

Returns hardcoded latitude and longitude for a given city.

✅ Replace this with a real geolocation API like OpenCage or Google Maps.


`get_forecast(latitude: float, longitude: float) -> str`

Returns a formatted weather forecast string for the given coordinates.

✅ Replace with a live weather API like api.weather.gov.
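A sketch of the shape of this tool, using canned data so it runs offline (the period field names mirror api.weather.gov's forecast entries, but everything here is illustrative):

```python
from typing import Dict, List

def get_forecast(latitude: float, longitude: float) -> str:
    """Return a formatted forecast string for the given coordinates.

    This sketch returns canned data. A live version would instead fetch
    https://api.weather.gov/points/{lat},{lon} to discover the forecast
    URL for that grid point, then request and format its periods.
    """
    periods: List[Dict] = [
        {"name": "Tonight", "temperature": 18, "shortForecast": "Clear"},
        {"name": "Tomorrow", "temperature": 24, "shortForecast": "Sunny"},
    ]
    lines = [
        f"{p['name']}: {p['temperature']}°, {p['shortForecast']}"
        for p in periods
    ]
    return f"Forecast for ({latitude}, {longitude}):\n" + "\n".join(lines)

print(get_forecast(48.8566, 2.3522))
```

Keeping the return type a plain string means any MCP client can display the result without knowing the weather API's response format.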


`format_forecast(forecasts: List[Dict]) -> str`

Helper function that formats multiple forecast entries into a readable string.
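A minimal sketch of such a helper, assuming each entry carries `name`, `temperature`, and `shortForecast` keys (the field names are assumptions, not confirmed from the source):

```python
from typing import Dict, List

def format_forecast(forecasts: List[Dict]) -> str:
    """Join forecast entries into one readable, line-per-entry string.

    Field names ("name", "temperature", "shortForecast") are assumed
    for illustration; the real helper may use different keys.
    """
    return "\n".join(
        f"{f['name']}: {f['temperature']}°, {f['shortForecast']}"
        for f in forecasts
    )

print(format_forecast([
    {"name": "Tonight", "temperature": 18, "shortForecast": "Clear"},
    {"name": "Tomorrow", "temperature": 24, "shortForecast": "Sunny"},
]))
```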


🧩 Want to Build an MCP Client?

Stay tuned! The next part of this project will include a lightweight client that can:

  • Auto-discover tools

  • Call them based on context

  • Build real-time agent workflows


🧠 Use Cases

  • Build agent backends with clean, callable tools

  • Expose local or cloud-based APIs to LLMs

  • Prototype tools for LangChain or OpenAI function-calling agents

  • Teach MCP integration through a practical example


📌 License

This project is open-source under the MIT License.


👋 Contributing

Pull requests are welcome! Feel free to open issues or suggest features you'd like to see.

