🔧 The Hardest Part of My Backend Project Wasn't the Backend 🌍
What building a climate risk API for Ecopulse taught me about data, systems thinking, and the gap between "it works" and production-ready.
1️⃣ It Started Smoothly — Too Smoothly
When I started building the backend for Ecopulse — a climate risk platform that surfaces drought and flood risk data for Nigerian regions — things moved fast.
FastAPI endpoints stood up cleanly. Supabase connected without drama. The first deploy to Render was painless.
I remember thinking: this is going well.
Then I touched the data. And everything changed.
2️⃣ The Illusion of "API-Ready" Data 🗂️
The data science team was processing satellite raster files, running CHIRPS ingestion pipelines, and aggregating raster data to ADM2 geospatial boundaries. That's a different world — and at some point, my backend had to meet theirs.
The dataset they produced was analytically sound. But when I tried to serve it through the API, things broke in ways I didn't expect.
Region identifiers came through as internal geoBoundaries ADM2 IDs. The frontend expected human-readable names: Ikeja, Kano, Maiduguri. There was no direct mapping.
The /explain endpoint started throwing 404s. Users would have seen "region not found" on a live product.
The fix wasn't more code on the API side — it was merging ADM2 metadata directly into the pipeline output so the bridge existed at the data level, not the serving level.
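In spirit, that data-level fix is a join between the pipeline output and the ADM2 metadata table. A minimal sketch, with made-up IDs and field names rather than the real Ecopulse schema:

```python
# Hypothetical shapes: pipeline rows keyed by internal geoBoundaries
# ADM2 IDs, plus a metadata table mapping those IDs to display names.
pipeline_rows = [
    {"adm2_id": "NGA-ADM2-1234", "drought_risk": 0.72},
]
adm2_metadata = {
    "NGA-ADM2-1234": {"name": "Ikeja", "state": "Lagos"},
}

def enrich(rows, metadata):
    """Merge human-readable region metadata into pipeline output,
    so the API never has to translate internal IDs at serve time."""
    enriched = []
    for row in rows:
        meta = metadata.get(row["adm2_id"])
        if meta is None:
            # Surface gaps when the dataset is built,
            # not as 404s on a live product.
            raise KeyError(f"No ADM2 metadata for {row['adm2_id']}")
        enriched.append({**row, **meta})
    return enriched
```

The key design choice is the hard failure on a missing ID: a gap in the mapping becomes a pipeline error you see at build time, instead of a "region not found" a user sees in production.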
Lesson: Data that works for analysis is not automatically ready for product integration.
3️⃣ Complexity Is Not Always Your Responsibility 🧠
There were moments where the volume of technical detail from the data team felt genuinely overwhelming. GeoTIFF files, xarray processing, STAC API traversal, raster aggregation.
I started going down rabbit holes trying to understand all of it.
At some point, I had to make a deliberate call: stop trying to own the entire system.
My responsibility was the API layer — the serving layer. The data pipeline was someone else's domain. My job was to integrate cleanly with its outputs, not replicate or own its internals.
Not every complex component in a system is mine to build.
Filtering what's mine to solve versus what I simply need to work with is one of the most underrated engineering skills I developed during this project.
4️⃣ The Architecture Decisions That Actually Mattered ⚖️
Two decisions shaped the project more than any individual line of code.
Separating pipeline from API. I kept the data science logic strictly in the ingestion pipeline and kept the API as a clean serving layer. This meant the DS team could update their processing logic without touching my endpoints — and I could evolve the API without breaking their pipeline.
Choosing deterministic over dynamic. There was pressure to add real-time location detection — grab the user's coordinates, reverse geocode, return regional data automatically. I pushed back. Under a deadline, introducing a third-party dependency with unpredictable behaviour late in the project was a risk I wasn't willing to take.
I kept the flow explicit: country → region → result. No ambiguity, no hidden dependencies.
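Stripped of the framework, that explicit flow reduces to a two-step lookup. A sketch with illustrative data and names, not the actual endpoint code:

```python
# Hypothetical pre-aggregated dataset: country -> region -> risk scores.
RISK_DATA = {
    "nigeria": {
        "ikeja": {"drought_risk": 0.72, "flood_risk": 0.18},
    },
}

def resolve(country: str, region: str) -> dict:
    """Deterministic country -> region -> result lookup.
    No geolocation, no reverse geocoding, no third-party calls."""
    country_data = RISK_DATA.get(country.lower())
    if country_data is None:
        raise LookupError(f"unknown country: {country}")
    result = country_data.get(region.lower())
    if result is None:
        raise LookupError(f"unknown region: {region}")
    return result
```

Every failure mode here is a known input error the API can map to a clear 404 message, which is exactly what a dynamic geolocation path would have made impossible to guarantee under deadline.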
Lesson: Good engineering is not about using every available tool. It's about choosing the simplest architecture that reliably solves the problem.
5️⃣ Deployment Is Where Assumptions Break 🚀
The code worked locally. Then I deployed to Render — and things that had no reason to fail, failed.
Import paths that worked on my machine broke in the deployment environment because of how Python resolves relative paths differently across contexts. A dependency that installed cleanly locally conflicted on the server.
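One fix that generalises, assuming a standard package layout rather than whatever Ecopulse actually uses: run the app as a module (`python -m app.main`) so absolute imports resolve the same way everywhere, and anchor file lookups to the module's own location instead of the working directory:

```python
from pathlib import Path

# The working directory differs between a local `uvicorn` run and a
# hosting platform's start command, so never resolve files against it.
BASE_DIR = Path(__file__).resolve().parent
DATA_PATH = BASE_DIR / "data" / "regions.json"  # hypothetical data file
```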
The virtual environment got corrupted mid-project and rebuilding it from scratch turned out to be faster than debugging it.
Render's free tier put the service to sleep after inactivity — I solved that with UptimeRobot.
None of these were glamorous problems. But they were the problems that actually blocked deployment. Solving them required a different kind of thinking — not algorithmic, but operational.
Local success is not a guarantee of production success. Environment setup is part of the job.
6️⃣ Git Doesn't Make Mistakes 🌿
There was a stretch where I was pushing changes, seeing "Everything up-to-date" in the terminal, and watching nothing change on the live service.
It took longer than I'd like to admit to realise I was working on develop and the deployment was pulling from main.
This experience exposed something more important than a branching mistake: in a multi-person project, you can't assume any branch has the full picture. Code evolves independently across branches, and integration timing matters in ways that aren't always visible until something breaks.
Git doesn't make mistakes. It follows instructions exactly. The confusion was always mine.
7️⃣ The AI Layer Was the Easy Part 🤖
Most people assume the GPT integration would be the hardest part of a project like this. It wasn't.
Calling the API, passing structured input, getting a completion back — that was mechanical. What took real thought was the context design: shaping the prompt so the model returned something actually useful, setting the right constraints (word count, tone, specificity), and making sure the input data feeding the AI was clean and consistent.
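Concretely, "context design" meant assembling the prompt from validated, structured fields rather than raw strings. A sketch under assumed constraints; the field names and limits are illustrative, not the production prompt:

```python
def build_prompt(region: str, drought_risk: float, flood_risk: float) -> str:
    """Build a constrained explanation prompt from clean inputs.
    Validation happens here, before anything reaches the model."""
    if not 0.0 <= drought_risk <= 1.0 or not 0.0 <= flood_risk <= 1.0:
        raise ValueError("risk scores must be normalised to [0, 1]")
    return (
        f"You explain climate risk for {region}, Nigeria.\n"
        f"Drought risk score: {drought_risk:.2f} (0 = none, 1 = severe).\n"
        f"Flood risk score: {flood_risk:.2f}.\n"
        "Explain the practical implications in plain English, "
        "in under 120 words, without jargon, and without "
        "speculating beyond these scores."
    )
```

The completion call itself is one line against whatever client you use; the leverage is all in what this function refuses to pass through.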
The value of AI in applications comes from how you shape the input, not just the model you choose.
The lesson wasn't about AI. It was about where real complexity lives in a system — and it's almost never where you expect it.
🎯 What I'd Do Differently
I'd align on data contracts earlier — before writing a single endpoint, I'd confirm exactly what the data team's output structure would look like and design around that.
I'd also treat production parity as a first-class concern from day one, not something to sort out at the end.
But honestly? I'm glad the project was hard in the ways it was.
The challenges that felt frustrating in the moment are the ones that changed how I think about building systems — not just coding them.
This project marked a transition from writing code to designing systems. Those are not the same thing.