🌊 Ecopulse – Climate Risk Intelligence Platform
Designed and deployed the backend infrastructure and APIs for a flood risk awareness platform that translates complex climate data into plain-language, location-specific guidance for communities and SMEs across Africa.
🎯 Problem & Objective
Across Africa, flooding causes widespread damage to homes, businesses, and livelihoods every year. The core issue is not a shortage of climate data; it is that the data is technical, generic, and disconnected from the people who need it most. Residents in flood-prone areas receive warnings they cannot interpret, and small businesses get no meaningful guidance at all.
The objective of Ecopulse was to bridge that gap by building a platform that ingests rainfall and flood risk data, processes it, and delivers it back to users in plain language, specific to their location. My role was to design and deploy the backend infrastructure that made this possible.
🌍 SDG Alignment: SDG 13 — Climate Action
Ecopulse directly supports SDG 13, which calls for urgent action to combat climate change and its impacts, with a specific focus on building resilience and adaptive capacity in vulnerable communities. Flooding is one of the most visible consequences of climate change across the African continent, and the communities most affected are often the ones with the least access to clear, usable information. By translating technical climate data into accessible, location-specific guidance, Ecopulse supports early preparedness, reduces avoidable losses, and helps everyday people and businesses make informed decisions before disaster strikes.
🏗️ High-Level Architecture
The system is structured in four clear layers working together:
- **Data Pipeline**: processes CHIRPS rainfall data and outputs structured risk metrics per region
- **Backend API (FastAPI)**: serves processed data through clean, validated endpoints
- **AI Layer (OpenRouter / GPT-4o-mini)**: converts technical risk outputs into simple, user-friendly explanations
- **Frontend (React + Vite)**: presents information to end users via a mobile-friendly web interface
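As a rough illustration of what the pipeline layer produces, a classification step might look like the sketch below. The thresholds here are invented for illustration only and are not the real criteria exposed at `/risk/criteria`:

```python
# Hypothetical risk classification step. The real pipeline derives its
# thresholds from CHIRPS rainfall data; these cutoffs are made up.
def classify_risk(rainfall_mm_7d: float) -> str:
    """Map a 7-day rainfall total (mm) to a coarse risk label."""
    if rainfall_mm_7d >= 150:
        return "high"
    if rainfall_mm_7d >= 75:
        return "medium"
    return "low"
```

The point of a step like this is that every region ends up with a small, structured metric the API can serve directly, rather than raw gridded rainfall.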
I owned the backend and infrastructure layer, and acted as the integration point between the data science pipeline, the AI layer, and the frontend team.
🧠 Key Design Decisions
RESTful API with FastAPI
Designed and implemented a full suite of structured endpoints covering the entire user journey:
| Method | Endpoint | Purpose |
|---|---|---|
| POST | /location/resolve | Resolves a user's location input into a structured region |
| GET | /countries | Returns the list of available countries |
| GET | /countries/{country_code}/regions | Returns regions under a specific country |
| GET | /risk/criteria | Exposes the classification logic for transparency |
| GET | /risk/high | Aggregates all currently high-risk regions |
| GET | /risk/{region_id} | Retrieves flood risk data for a specific region |
| GET | /risk/{region_id}/explain | Returns an AI-generated plain-language explanation of the risk |
| POST | /eco/chat | Powers the ECO assistant for conversational flood risk queries |
| GET | /health | Health check endpoint for uptime monitoring |
| POST | /subscribe | Registers a user for flood alert notifications |
| DELETE | /unsubscribe | Removes a user from the alert system |
Structured over Free-Text Location Input
Instead of relying on free-text location search, I implemented a country → region selection model using stable region_id identifiers. This eliminated ambiguity, improved data-mapping accuracy, and made the API deterministic and reliable.
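The country → region model can be sketched as a pair of pure functions; the region codes below are made up for illustration and are not the platform's real identifiers:

```python
# Illustrative country → region registry keyed by stable region_id values.
REGIONS = {
    "NG": {"NG-LA-001": "Lagos", "NG-KD-002": "Kaduna"},
    "KE": {"KE-NB-001": "Nairobi"},
}

def list_regions(country_code: str) -> dict:
    """Return the stable region_id → name mapping for one country."""
    return REGIONS.get(country_code.upper(), {})

def resolve(country_code: str, region_name: str):
    """Map a (country, region name) pair to its deterministic region_id,
    or None when no match exists — no fuzzy guessing."""
    for region_id, name in list_regions(country_code).items():
        if name.lower() == region_name.lower():
            return region_id
    return None
```

Because the same inputs always produce the same `region_id`, downstream joins against the risk data never depend on how a user happened to spell a place name.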
Database over File-Based Serving
Transitioned from serving raw .parquet files directly to structured Supabase database tables. This improved performance, enabled upsert operations with conflict resolution, and made the system more maintainable.
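The upsert-with-conflict-resolution pattern can be demonstrated with SQLite's `ON CONFLICT` clause, which mirrors the Postgres syntax that Supabase runs on; the table and column names here are illustrative, not the production schema:

```python
import sqlite3

# In-memory stand-in for the Supabase (Postgres) table.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE risk (region_id TEXT PRIMARY KEY, risk_level TEXT)")

def upsert_risk(region_id: str, risk_level: str) -> None:
    """Insert a row, or update it in place if the region already exists."""
    conn.execute(
        """INSERT INTO risk (region_id, risk_level) VALUES (?, ?)
           ON CONFLICT(region_id) DO UPDATE SET risk_level = excluded.risk_level""",
        (region_id, risk_level),
    )

upsert_risk("NG-LA-001", "medium")
upsert_risk("NG-LA-001", "high")  # re-running the pipeline updates, never duplicates
```

This is what makes repeated pipeline runs idempotent: refreshed risk metrics overwrite the previous row for each region instead of accumulating duplicates.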
AI Integration via OpenRouter
Used OpenRouter with GPT-4o-mini rather than direct OpenAI integration, for flexibility and cost efficiency. The AI layer takes technical rainfall metrics and risk classifications and returns concise, accessible explanations for non-technical users.
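A hedged sketch of how the OpenRouter call might be structured, using the OpenAI-compatible client interface that OpenRouter exposes. The prompt wording and function names are assumptions, not the production code:

```python
def build_prompt(region: str, rainfall_mm: float, risk: str) -> str:
    """Turn technical risk metrics into a plain-language instruction
    for the model. Wording is illustrative."""
    return (
        f"Explain in plain language for a non-technical resident of {region}: "
        f"7-day rainfall was {rainfall_mm} mm and the flood risk level is "
        f"'{risk}'. Keep it under three sentences and suggest one concrete "
        "precaution."
    )

def explain_risk(region: str, rainfall_mm: float, risk: str, api_key: str) -> str:
    # Imported lazily so the module loads without the openai package installed.
    from openai import OpenAI

    # OpenRouter speaks the OpenAI chat-completions protocol, so swapping
    # models (or providers) is a one-line change.
    client = OpenAI(base_url="https://openrouter.ai/api/v1", api_key=api_key)
    resp = client.chat.completions.create(
        model="openai/gpt-4o-mini",
        messages=[{"role": "user", "content": build_prompt(region, rainfall_mm, risk)}],
    )
    return resp.choices[0].message.content
```

Keeping prompt construction separate from the network call also makes the prompt logic unit-testable without spending tokens.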
Practical DevOps Decisions
Deployed the backend on Render with environment variable management and configured UptimeRobot for continuous health checks to prevent service sleep on the free tier. Intentionally avoided Docker and Kubernetes for the MVP, prioritizing stability and delivery speed over infrastructure complexity.
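Environment variable management on a platform like Render can be made fail-fast with a small startup check. This is a sketch under assumptions; the variable names below are illustrative, not the actual deployment settings:

```python
import os

# Hypothetical required settings — names are assumptions for illustration.
REQUIRED_VARS = ("SUPABASE_URL", "SUPABASE_KEY", "OPENROUTER_API_KEY")

def load_config(env=None) -> dict:
    """Fail at startup with a clear message if configuration is incomplete,
    instead of failing later on the first request that needs a secret."""
    env = os.environ if env is None else env
    missing = [v for v in REQUIRED_VARS if v not in env]
    if missing:
        raise RuntimeError(f"Missing environment variables: {missing}")
    return {v: env[v] for v in REQUIRED_VARS}
```

A check like this surfaces misconfiguration in the deploy logs immediately, which matters on a free tier where debugging a sleeping service is slow.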
🛠️ Tools & Technologies
- Backend: Python, FastAPI
- Database: Supabase (Postgres), replacing raw .parquet file serving
- AI: OpenRouter with GPT-4o-mini
- Frontend: React + Vite
- Data: CHIRPS rainfall data, GeoTIFF ingestion, ADM2 administrative boundaries
- Deployment & monitoring: Render, UptimeRobot
🚧 Challenges Faced
- Dependency conflicts on deployment: Resolved Supabase and HTTPX version conflicts that were causing deployment failures on Render.
- Data structure gaps: The pipeline output was missing location metadata needed by the frontend. I enriched the dataset by merging ADM2 administrative boundary data including country names, region names, and country codes.
- Route conflicts in FastAPI: Debugged and resolved conflicting route definitions that were causing incorrect endpoint resolution.
- Branch misalignment: Resolved Git branch conflicts across a multi-person team working on overlapping parts of the codebase.
- Pipeline stability: Debugged and stabilized the data pipeline execution, and pivoted the rainfall ingestion strategy from STAC-based API queries to local GeoTIFF ingestion for better determinism and reproducibility.
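The ADM2 enrichment described above can be sketched as a simple left join; the column names are assumptions rather than the pipeline's actual schema:

```python
import pandas as pd

# Pipeline output missing the location metadata the frontend needs.
risk = pd.DataFrame({"region_id": ["NG-LA-001"], "risk_level": ["high"]})

# ADM2 administrative boundary metadata (illustrative values).
adm2 = pd.DataFrame({
    "region_id": ["NG-LA-001"],
    "country_code": ["NG"],
    "country_name": ["Nigeria"],
    "region_name": ["Lagos"],
})

# Left join keeps every risk row even where metadata is incomplete.
enriched = risk.merge(adm2, on="region_id", how="left")
```

Using a left join (rather than inner) means a gap in the boundary data never silently drops a region's risk record.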
💡 Key Learnings
- Bridging data science and backend engineering requires deliberate architecture decisions, not just working code.
- Keeping layers cleanly separated made debugging faster and collaboration smoother across the team.
- Practical DevOps under constraints means knowing what not to build just as much as what to build.
- Cross-functional collaboration across backend, data science, frontend, and project management disciplines sharpened my ability to communicate decisions clearly and build with the full system in mind.
✅ Outcome & Final Result
Ecopulse was successfully built and deployed as a live, cloud-based platform. The backend delivered reliable, structured flood risk data through a production-aware API, with AI-powered explanations that made technical climate information accessible to everyday users.
The system handled real geospatial data, integrated across four technical disciplines, and was delivered within the capstone timeline.
✨ Closing Reflection
This project strengthened my ability to design real-world backend systems, debug across application and database layers, and make practical DevOps decisions under pressure. Working across a cross-functional team with different technical backgrounds pushed me to communicate clearly, integrate thoughtfully, and deliver a production-aware system that solves a problem that actually matters.