🌊 Ecopulse – Climate Risk Intelligence Platform

Designed and deployed the backend infrastructure and APIs for a flood risk awareness platform that translates complex climate data into plain-language, location-specific guidance for communities and SMEs across Africa.

Status: Completed ✅ · 🌍 SDG 13: Climate Action · Women Techsters Fellowship

🎯 Problem & Objective

Across Africa, flooding causes widespread damage to homes, businesses, and livelihoods every year. The core issue is not a shortage of climate data; it is that the data is technical, generic, and disconnected from the people who need it most. Residents in flood-prone areas receive warnings they cannot interpret, and small businesses get no meaningful guidance at all.

The objective of Ecopulse was to bridge that gap by building a platform that ingests rainfall and flood risk data, processes it, and delivers it back to users in plain language, specific to their location. My role was to design and deploy the backend infrastructure that made this possible.

🌍 SDG Alignment: SDG 13 — Climate Action

Ecopulse directly supports SDG 13, which calls for urgent action to combat climate change and its impacts, with a specific focus on building resilience and adaptive capacity in vulnerable communities. Flooding is one of the most visible consequences of climate change across the African continent, and the communities most affected are often the ones with the least access to clear, usable information. By translating technical climate data into accessible, location-specific guidance, Ecopulse supports early preparedness, reduces avoidable losses, and helps everyday people and businesses make informed decisions before disaster strikes.

🏗️ High-Level Architecture

The system is structured in four clear layers working together:

1. **Data Pipeline**: processes CHIRPS rainfall data and outputs structured risk metrics per region.
2. **Backend API (FastAPI)**: serves processed data through clean, validated endpoints.
3. **AI Layer (OpenRouter / GPT-4o-mini)**: converts technical risk outputs into simple, user-friendly explanations.
4. **Frontend (React + Vite)**: presents information to end users via a mobile-friendly web interface.

I owned the backend and infrastructure layer, and acted as the integration point between the data science pipeline, the AI layer, and the frontend team.
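
The pipeline's core transformation (rainfall totals in, per-region risk metrics out) can be sketched as below. The thresholds and region IDs are illustrative assumptions for the sketch, not Ecopulse's actual classification criteria or schema.

```python
from dataclasses import dataclass

# Illustrative rainfall cutoffs in mm; the real pipeline derives its
# classification from CHIRPS data, and these numbers are assumptions.
RISK_THRESHOLDS = [(150.0, "high"), (75.0, "medium"), (0.0, "low")]

@dataclass
class RegionRisk:
    region_id: str
    rainfall_mm: float
    risk_level: str

def classify_region(region_id: str, rainfall_mm: float) -> RegionRisk:
    """Map a rainfall total onto a coarse risk label for one region."""
    for threshold, label in RISK_THRESHOLDS:
        if rainfall_mm >= threshold:
            return RegionRisk(region_id, rainfall_mm, label)
    return RegionRisk(region_id, rainfall_mm, "low")

print(classify_region("NG-LA-001", 182.4).risk_level)  # high
```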

🧠 Key Design Decisions

RESTful API with FastAPI

Designed and implemented a full suite of structured endpoints covering the entire user journey:

| Method | Endpoint | Purpose |
| --- | --- | --- |
| POST | `/location/resolve` | Resolves a user's location input into a structured region |
| GET | `/countries` | Returns the list of available countries |
| GET | `/countries/{country_code}/regions` | Returns regions under a specific country |
| GET | `/risk/criteria` | Exposes the classification logic for transparency |
| GET | `/risk/high` | Aggregates all currently high-risk regions |
| GET | `/risk/{region_id}` | Retrieves flood risk data for a specific region |
| GET | `/risk/{region_id}/explain` | Returns an AI-generated plain-language explanation of the risk |
| POST | `/eco/chat` | Powers the ECO assistant for conversational flood risk queries |
| GET | `/health` | Health check endpoint for uptime monitoring |
| POST | `/subscribe` | Registers a user for flood alert notifications |
| DELETE | `/unsubscribe` | Removes a user from the alert system |

Structured over Free-Text Location Input

Instead of relying on free-text location search, I implemented a country → region selection model using stable region_id identifiers. This eliminated ambiguity, improved data mapping accuracy, and made the API deterministic and reliable.
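
The country → region model can be sketched as a deterministic lookup keyed by stable `region_id` values. The country and region codes below are placeholders, not the platform's real data.

```python
# Stable region_id keys make the lookup deterministic: the same inputs
# always resolve to the same region, with no free-text ambiguity.
REGIONS = {
    "NG": [
        {"region_id": "NG-LA-001", "name": "Lagos Mainland"},
        {"region_id": "NG-KN-002", "name": "Kano Municipal"},
    ],
}

def resolve_region(country_code: str, region_id: str) -> dict:
    """Resolve a (country, region_id) pair to a structured region record."""
    for region in REGIONS.get(country_code.upper(), []):
        if region["region_id"] == region_id:
            return region
    raise KeyError(f"Unknown region {region_id!r} in country {country_code!r}")
```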

Database over File-Based Serving

Transitioned from serving raw .parquet files directly to structured Supabase database tables. This improved performance, enabled upsert operations with conflict resolution, and made the system more maintainable.
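
The upsert-with-conflict-resolution behaviour the backend gets from Postgres (`INSERT ... ON CONFLICT (region_id) DO UPDATE`) can be sketched in pure Python; column names here are illustrative.

```python
def upsert(table: list[dict], rows: list[dict], key: str = "region_id") -> list[dict]:
    """Insert rows, updating in place when the key already exists."""
    index = {row[key]: i for i, row in enumerate(table)}
    for row in rows:
        if row[key] in index:
            table[index[row[key]]].update(row)   # conflict -> update existing row
        else:
            index[row[key]] = len(table)
            table.append(row)                    # no conflict -> plain insert
    return table

table = [{"region_id": "NG-LA-001", "risk_level": "low"}]
upsert(table, [
    {"region_id": "NG-LA-001", "risk_level": "high"},     # updates
    {"region_id": "NG-KN-002", "risk_level": "medium"},   # inserts
])
```

Because each pipeline run re-ingests the latest metrics, idempotent upserts mean a rerun refreshes existing regions instead of duplicating them.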

AI Integration via OpenRouter

Used OpenRouter with GPT-4o-mini rather than direct OpenAI integration, for flexibility and cost efficiency. The AI layer takes technical rainfall metrics and risk classifications and returns concise, accessible explanations for non-technical users.
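
A sketch of that flow, using OpenRouter's OpenAI-compatible chat completions endpoint with the `openai/gpt-4o-mini` model. The function names, prompt wording, and metric fields are assumptions for illustration; the network call is defined but never executed here.

```python
import json
import urllib.request

OPENROUTER_URL = "https://openrouter.ai/api/v1/chat/completions"

def build_explanation_prompt(region_id: str, rainfall_mm: float, risk_level: str) -> str:
    """Turn technical risk metrics into a plain-language instruction for the model."""
    return (
        f"Region {region_id} recorded {rainfall_mm:.1f} mm of rainfall and is "
        f"classified as {risk_level} flood risk. Explain what this means for a "
        "resident or small business owner in two short, non-technical sentences."
    )

def explain_risk(api_key: str, region_id: str, rainfall_mm: float, risk_level: str) -> str:
    """POST the prompt to GPT-4o-mini via OpenRouter and return the explanation."""
    payload = {
        "model": "openai/gpt-4o-mini",
        "messages": [{
            "role": "user",
            "content": build_explanation_prompt(region_id, rainfall_mm, risk_level),
        }],
    }
    request = urllib.request.Request(
        OPENROUTER_URL,
        data=json.dumps(payload).encode(),
        headers={"Authorization": f"Bearer {api_key}",
                 "Content-Type": "application/json"},
    )
    with urllib.request.urlopen(request) as response:
        body = json.load(response)
    return body["choices"][0]["message"]["content"]
```

Keeping the prompt construction separate from the HTTP call makes the explanation style testable without spending tokens.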

Practical DevOps Decisions

Deployed the backend on Render with environment variable management and configured UptimeRobot for continuous health checks to prevent service sleep on the free tier. Intentionally avoided Docker and Kubernetes for the MVP, prioritizing stability and delivery speed over infrastructure complexity.

🛠️ Tools & Technologies

FastAPI (Python) · Pydantic · Supabase (PostgreSQL) · Pandas · Parquet · CHIRPS Rainfall Data · geoBoundaries (ADM2) · OpenRouter API · GPT-4o-mini · Render · Vercel · GitHub Actions · UptimeRobot · Resend · Git

🚧 Challenges Faced

💡 Key Learnings

✅ Outcome & Final Result

Ecopulse was successfully built and deployed as a live, cloud-based platform. The backend delivered reliable, structured flood risk data through a production-aware API, with AI-powered explanations that made technical climate information accessible to everyday users.

The system handled real geospatial data, integrated across four technical disciplines, and was delivered within the capstone timeline.

✨ Closing Reflection

This project strengthened my ability to design real-world backend systems, debug across application and database layers, and make practical DevOps decisions under pressure. Working across a cross-functional team with different technical backgrounds pushed me to communicate clearly, integrate thoughtfully, and deliver a production-aware system that solves a problem that actually matters.

Explore the raw build 👉🏽