Three Things We Learned Deploying 127 Sensors Across Nairobi
From site surveys to live data streams — the operational, technical, and human lessons from our largest IoT deployment to date.
Joel Kate
Co-Founder, Muran Systems

Deploying 127 low-cost air quality sensors across Nairobi with AQSEA was the most complex project we had undertaken. The brief was straightforward: instrument the city, make the data publicly accessible, keep the network running. The execution was anything but simple.
Here are three things we got wrong at the start and, by the end, had learned to do properly.
1. Site surveys are not optional
We spent the first week of the project trying to shortcut site surveys. We had satellite imagery, we had maps, we had confident assumptions about where the sensors would go. We were wrong about roughly 40% of them.
Physical site surveys revealed problems that no amount of desk research could catch:
- Rooftop access restrictions
- Cellular dead zones in specific courtyards
- Buildings under construction that would shade the solar panels within six months
The time spent doing surveys properly in the first two weeks saved us from reinstallations that would have cost far more later. For any deployment above 20 nodes, physical site verification is non-negotiable.
2. Stakeholder communication is a technical requirement
Sensors installed in public and semi-public spaces require ongoing cooperation from building owners, facility managers, and local officials. We treated this as a one-time permission process in the early phase. It is not.
Staff changes. Building management turns over. The person who gave permission to mount a sensor on the roof in March may no longer work there in September, and the new facilities manager has no record of the arrangement.
We now maintain a stakeholder register for every deployment—contact details, relationship notes, last communication date—and schedule periodic check-ins. This is as much an engineering discipline as sensor calibration.
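A register like this can be as simple as a flat list with a scheduled query over it. Here is a minimal sketch; the field names and the 90-day check-in interval are illustrative assumptions, not the actual schema we use:

```python
from dataclasses import dataclass
from datetime import date, timedelta

# Hypothetical stakeholder register entry. Field names and the
# check-in interval below are assumptions for illustration.
@dataclass
class Stakeholder:
    site_id: str
    name: str
    role: str
    phone: str
    last_contact: date
    notes: str = ""

def due_for_checkin(register, today=None, interval_days=90):
    """Return stakeholders not contacted within the interval."""
    today = today or date.today()
    cutoff = today - timedelta(days=interval_days)
    return [s for s in register if s.last_contact < cutoff]

register = [
    Stakeholder("KIB-03", "A. Mwangi", "facilities manager",
                "+254-7xx", date(2024, 3, 14)),
    Stakeholder("CBD-11", "J. Otieno", "building owner",
                "+254-7xx", date(2024, 8, 2)),
]

overdue = due_for_checkin(register, today=date(2024, 9, 1))
```

The point is less the data structure than the discipline: the query runs on a schedule, and an overdue entry is treated like any other maintenance ticket.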
3. Data quality requires ongoing attention
Sensors drift. Dust accumulates on optical sensors. Firmware updates change sampling behaviour in subtle ways. The data stream that looked clean in month one will develop anomalies by month six if nobody is watching for them.
We run automated outlier detection on incoming readings and flag devices whose readings deviate significantly from nearby nodes. A sensor reporting zero PM2.5 in a busy road corridor is probably not measuring clean air—it is probably blocked or offline. Catching that quickly means less corrupt data in the downstream analysis. Real-time monitoring of data quality is part of the deployment cost, not optional maintenance.
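The neighbour-comparison check can be sketched in a few lines. This is a simplified illustration, not our production pipeline; the deviation threshold and the flat-zero rule are assumed values:

```python
from statistics import median

# Illustrative sketch of a neighbour-comparison quality check.
# The 15 ug/m3 threshold and flat-zero rule are assumptions.
def flag_outliers(readings, neighbours, abs_threshold=15.0):
    """readings: {device_id: pm25}; neighbours: {device_id: [nearby ids]}.
    Flags devices whose reading deviates from the local median by more
    than abs_threshold, or that report zero where neighbours do not."""
    flagged = []
    for dev, value in readings.items():
        local = [readings[n] for n in neighbours.get(dev, []) if n in readings]
        if not local:
            continue  # no nearby nodes to compare against
        ref = median(local)
        if value == 0.0 and ref > abs_threshold:
            flagged.append(dev)  # likely blocked or offline
        elif abs(value - ref) > abs_threshold:
            flagged.append(dev)  # drifting relative to neighbours
    return flagged

readings = {"A": 42.0, "B": 38.5, "C": 0.0, "D": 40.1}
neighbours = {"C": ["A", "B", "D"], "A": ["B", "D"]}
```

Here device C, reporting zero in a corridor where its neighbours read around 40 µg/m³, is the one that gets flagged for a physical check.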
The Nairobi air quality network is now one of the most densely instrumented low-cost sensor networks on the continent. The data is informing public health research, urban planning decisions, and community awareness. That outcome was worth the hard lessons.