Alexander Güntert posted on LinkedIn about a new open-source project his colleague Hayal Oezkan had built: an MCP server for Zurich's open data. The post got quite a few reactions, and I liked the idea very much. But it still required a local installation, which is not something non-developers can easily set up. So I packaged and deployed it on our servers, where it's now available for anyone to use as the "OGD City of Zurich" remote MCP server.

The City of Zurich publishes over 900 datasets as open data, spread across six different APIs. There's CKAN for the main data catalog, a WFS Geoportal for geodata, the Paris API for parliamentary information from the Gemeinderat, a tourism API, SPARQL linked data, and ParkenDD for real-time parking data. All public, all freely available. But until now, making an AI assistant actually use these APIs meant writing custom integrations for each one.

The MCP server wraps all six APIs into 20 tools that any MCP-compatible AI assistant can call directly. Ask "How warm is it in Zurich right now?" and it queries the live weather stations. Ask about parking availability, and it pulls real-time data from 36 parking garages. It also covers parliamentary motions, tourism recommendations, SQL queries on the data store, and GeoJSON features for school locations, playgrounds, or climate data. All through a single, standardized Model Context Protocol interface.
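To illustrate the core idea, here is a minimal sketch (not the actual server code) of the pattern behind such a server: a single registry maps tool names to handler functions, each of which would wrap one of the backend APIs. All tool names and return values here are hypothetical stand-ins, not the real tools of the OGD server.

```python
"""Illustrative sketch: one tool registry dispatching to per-API handlers."""
from typing import Any, Callable

# Registry of tool handlers, keyed by tool name.
TOOLS: dict[str, Callable[..., Any]] = {}

def tool(fn: Callable[..., Any]) -> Callable[..., Any]:
    """Register a handler under its function name (mimics an MCP tool decorator)."""
    TOOLS[fn.__name__] = fn
    return fn

@tool
def get_weather(station: str) -> dict:
    # In a real server this would query a live weather-station API.
    # The value below is a hypothetical placeholder.
    return {"station": station, "temperature_c": 18.4}

@tool
def get_parking(garage: str) -> dict:
    # In a real server this would query real-time parking data.
    return {"garage": garage, "free_spots": 42}

def call_tool(name: str, **kwargs: Any) -> dict:
    """Entry point the MCP layer would expose: look up a tool and invoke it."""
    return TOOLS[name](**kwargs)
```

The point of the pattern is that the AI assistant only ever sees uniformly described tools; which of the six backend APIs a tool talks to is an implementation detail of its handler.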

Hayal Oezkan built it in Python using FastMCP. One file for the server with all 20 tool handlers. The repo is on GitHub.

Deploying it on our side took very little effort. The server supports both the stdio transport for local use (for example in Claude Desktop or Claude Code) and SSE or streamable HTTP for remote deployment. I packaged it with Docker, deployed it to our cluster, and now it's available as a remote MCP server that anyone can add to their AI tools without installing anything locally.
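For example, adding a remote MCP server to Claude Code is a one-liner. The server name and URL below are placeholders, not the actual endpoint:

```shell
# Register a remote MCP server over SSE (name and URL are placeholders).
claude mcp add --transport sse ogd-zurich https://example.com/sse
```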

The natural next step is integrating this MCP server into ZüriCityGPT. Right now, the chatbot answers questions mainly based on crawled website content (apart from, for example, waste collection dates). With the MCP server, it could pull live data directly, actual numbers from actual APIs. I hope I'll soon find the time to do that.

A city employee builds something useful in the open, publishes the code, and within a day it's deployed and available to a wider audience. Open data and open source working together, exactly as intended.